Oct 01 10:17:20 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 01 10:17:20 crc restorecon[4664]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 10:17:20 crc restorecon[4664]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc 
restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 10:17:20 crc 
restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 
10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 10:17:20 crc 
restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 10:17:20 crc 
restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:20
crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 
10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 10:17:20 crc 
restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc 
restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 10:17:20 crc restorecon[4664]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc 
restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 
crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc 
restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:20 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 10:17:21 crc restorecon[4664]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc 
restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc 
restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc 
restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 10:17:21 crc 
restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 10:17:21 crc restorecon[4664]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 10:17:21 crc restorecon[4664]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 10:17:21 crc restorecon[4664]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 01 10:17:21 crc kubenswrapper[4735]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 01 10:17:21 crc kubenswrapper[4735]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 01 10:17:21 crc kubenswrapper[4735]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 01 10:17:21 crc kubenswrapper[4735]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 01 10:17:21 crc kubenswrapper[4735]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 01 10:17:21 crc kubenswrapper[4735]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.667377 4735 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671190 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671210 4735 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671215 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671219 4735 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671224 4735 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671228 4735 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671233 4735 feature_gate.go:330] unrecognized feature gate: Example Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671237 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671240 4735 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671245 4735 
feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671250 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671256 4735 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671266 4735 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671272 4735 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671276 4735 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671280 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671283 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671288 4735 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671292 4735 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671296 4735 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671299 4735 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671303 4735 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671307 4735 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671310 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 01 
10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671314 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671318 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671322 4735 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671326 4735 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671329 4735 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671333 4735 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671338 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671342 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671346 4735 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671350 4735 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671354 4735 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671358 4735 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671361 4735 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671366 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 01 10:17:21 crc 
kubenswrapper[4735]: W1001 10:17:21.671369 4735 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671373 4735 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671377 4735 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671381 4735 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671385 4735 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671389 4735 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671393 4735 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671397 4735 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671401 4735 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671404 4735 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671408 4735 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671412 4735 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671415 4735 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671419 4735 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671423 4735 feature_gate.go:330] 
unrecognized feature gate: ManagedBootImages Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671427 4735 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671431 4735 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671436 4735 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671440 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671445 4735 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671450 4735 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671453 4735 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671457 4735 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671462 4735 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671466 4735 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671470 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671474 4735 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671477 4735 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671480 4735 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671484 4735 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671487 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671508 4735 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.671512 4735 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671591 4735 flags.go:64] FLAG: --address="0.0.0.0" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671598 4735 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671606 4735 flags.go:64] FLAG: --anonymous-auth="true" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671612 4735 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671618 4735 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671623 4735 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671628 4735 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671633 4735 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671638 4735 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671642 4735 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671646 4735 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671650 4735 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671654 4735 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671658 4735 flags.go:64] FLAG: --cgroup-root="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671662 4735 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 01 10:17:21 crc 
kubenswrapper[4735]: I1001 10:17:21.671666 4735 flags.go:64] FLAG: --client-ca-file="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671670 4735 flags.go:64] FLAG: --cloud-config="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671673 4735 flags.go:64] FLAG: --cloud-provider="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671677 4735 flags.go:64] FLAG: --cluster-dns="[]" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671684 4735 flags.go:64] FLAG: --cluster-domain="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671688 4735 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671692 4735 flags.go:64] FLAG: --config-dir="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671695 4735 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671700 4735 flags.go:64] FLAG: --container-log-max-files="5" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671706 4735 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671711 4735 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671715 4735 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671719 4735 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671723 4735 flags.go:64] FLAG: --contention-profiling="false" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671728 4735 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671732 4735 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671736 4735 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 01 10:17:21 crc 
kubenswrapper[4735]: I1001 10:17:21.671741 4735 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671747 4735 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671752 4735 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671756 4735 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671761 4735 flags.go:64] FLAG: --enable-load-reader="false" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671765 4735 flags.go:64] FLAG: --enable-server="true" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671769 4735 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671775 4735 flags.go:64] FLAG: --event-burst="100" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671779 4735 flags.go:64] FLAG: --event-qps="50" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671783 4735 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671788 4735 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671791 4735 flags.go:64] FLAG: --eviction-hard="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671797 4735 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671802 4735 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671807 4735 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671813 4735 flags.go:64] FLAG: --eviction-soft="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671819 4735 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 01 
10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671825 4735 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671830 4735 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671835 4735 flags.go:64] FLAG: --experimental-mounter-path="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671840 4735 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671845 4735 flags.go:64] FLAG: --fail-swap-on="true" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671849 4735 flags.go:64] FLAG: --feature-gates="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671854 4735 flags.go:64] FLAG: --file-check-frequency="20s" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671858 4735 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671863 4735 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671867 4735 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671871 4735 flags.go:64] FLAG: --healthz-port="10248" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671876 4735 flags.go:64] FLAG: --help="false" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671880 4735 flags.go:64] FLAG: --hostname-override="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671884 4735 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671889 4735 flags.go:64] FLAG: --http-check-frequency="20s" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671894 4735 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671898 4735 flags.go:64] FLAG: --image-credential-provider-config="" Oct 01 
10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671903 4735 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671908 4735 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671913 4735 flags.go:64] FLAG: --image-service-endpoint="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671916 4735 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671921 4735 flags.go:64] FLAG: --kube-api-burst="100" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671926 4735 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671930 4735 flags.go:64] FLAG: --kube-api-qps="50" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671934 4735 flags.go:64] FLAG: --kube-reserved="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671938 4735 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671941 4735 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671946 4735 flags.go:64] FLAG: --kubelet-cgroups="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671949 4735 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671953 4735 flags.go:64] FLAG: --lock-file="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671957 4735 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671961 4735 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671966 4735 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671972 4735 flags.go:64] FLAG: --log-json-split-stream="false" Oct 01 
10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671976 4735 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671980 4735 flags.go:64] FLAG: --log-text-split-stream="false" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671984 4735 flags.go:64] FLAG: --logging-format="text" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671988 4735 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671992 4735 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.671997 4735 flags.go:64] FLAG: --manifest-url="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672001 4735 flags.go:64] FLAG: --manifest-url-header="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672006 4735 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672010 4735 flags.go:64] FLAG: --max-open-files="1000000" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672015 4735 flags.go:64] FLAG: --max-pods="110" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672019 4735 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672023 4735 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672027 4735 flags.go:64] FLAG: --memory-manager-policy="None" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672031 4735 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672035 4735 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672039 4735 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672044 4735 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672053 4735 flags.go:64] FLAG: --node-status-max-images="50" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672058 4735 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672062 4735 flags.go:64] FLAG: --oom-score-adj="-999" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672066 4735 flags.go:64] FLAG: --pod-cidr="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672070 4735 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672076 4735 flags.go:64] FLAG: --pod-manifest-path="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672081 4735 flags.go:64] FLAG: --pod-max-pids="-1" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672085 4735 flags.go:64] FLAG: --pods-per-core="0" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672089 4735 flags.go:64] FLAG: --port="10250" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672093 4735 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672097 4735 flags.go:64] FLAG: --provider-id="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672101 4735 flags.go:64] FLAG: --qos-reserved="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672105 4735 flags.go:64] FLAG: --read-only-port="10255" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672109 4735 flags.go:64] FLAG: --register-node="true" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672113 4735 flags.go:64] FLAG: --register-schedulable="true" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672117 4735 flags.go:64] FLAG: 
--register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672124 4735 flags.go:64] FLAG: --registry-burst="10" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672128 4735 flags.go:64] FLAG: --registry-qps="5" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672132 4735 flags.go:64] FLAG: --reserved-cpus="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672136 4735 flags.go:64] FLAG: --reserved-memory="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672140 4735 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672145 4735 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672149 4735 flags.go:64] FLAG: --rotate-certificates="false" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672153 4735 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672157 4735 flags.go:64] FLAG: --runonce="false" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672161 4735 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672165 4735 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672169 4735 flags.go:64] FLAG: --seccomp-default="false" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672174 4735 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672178 4735 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672182 4735 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672186 4735 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672191 
4735 flags.go:64] FLAG: --storage-driver-password="root" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672195 4735 flags.go:64] FLAG: --storage-driver-secure="false" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672199 4735 flags.go:64] FLAG: --storage-driver-table="stats" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672202 4735 flags.go:64] FLAG: --storage-driver-user="root" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672206 4735 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672210 4735 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672214 4735 flags.go:64] FLAG: --system-cgroups="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672218 4735 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672224 4735 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672231 4735 flags.go:64] FLAG: --tls-cert-file="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672240 4735 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672246 4735 flags.go:64] FLAG: --tls-min-version="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672249 4735 flags.go:64] FLAG: --tls-private-key-file="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672253 4735 flags.go:64] FLAG: --topology-manager-policy="none" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672257 4735 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672261 4735 flags.go:64] FLAG: --topology-manager-scope="container" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672265 4735 flags.go:64] FLAG: --v="2" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672271 4735 
flags.go:64] FLAG: --version="false" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672276 4735 flags.go:64] FLAG: --vmodule="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672280 4735 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672285 4735 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672379 4735 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672383 4735 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672387 4735 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672391 4735 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672396 4735 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672400 4735 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672406 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672410 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672414 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672419 4735 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672425 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672430 4735 feature_gate.go:330] unrecognized 
feature gate: PrivateHostedZoneAWS Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672435 4735 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672440 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672443 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672447 4735 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672451 4735 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672455 4735 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672460 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672463 4735 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672469 4735 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672473 4735 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672477 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672480 4735 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672484 4735 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672504 4735 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672509 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672512 4735 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672516 4735 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672519 4735 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672523 4735 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672526 4735 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672530 4735 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672533 4735 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672537 4735 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672540 4735 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672544 4735 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672549 4735 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672553 4735 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672557 4735 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672560 4735 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672564 4735 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672569 4735 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672573 4735 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672577 4735 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672580 4735 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672584 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672587 4735 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672592 4735 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672596 4735 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672600 4735 feature_gate.go:330] unrecognized feature gate: Example Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672603 4735 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672610 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672614 4735 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672617 4735 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672620 4735 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672624 4735 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672627 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672631 4735 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672634 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672638 4735 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672642 4735 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672645 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672649 4735 
feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672652 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672655 4735 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672659 4735 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672662 4735 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672666 4735 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672669 4735 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.672677 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.672684 4735 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.681703 4735 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.681800 4735 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682246 4735 feature_gate.go:330] unrecognized feature gate: 
AutomatedEtcdBackup Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682263 4735 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682268 4735 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682272 4735 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682276 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682280 4735 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682284 4735 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682288 4735 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682292 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682295 4735 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682299 4735 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682303 4735 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682307 4735 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682310 4735 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682314 4735 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682318 4735 feature_gate.go:330] unrecognized 
feature gate: InsightsConfig Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682321 4735 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682325 4735 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682329 4735 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682332 4735 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682336 4735 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682340 4735 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682343 4735 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682346 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682350 4735 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682354 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682357 4735 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682361 4735 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682365 4735 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682371 4735 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 01 10:17:21 crc 
kubenswrapper[4735]: W1001 10:17:21.682376 4735 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682380 4735 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682384 4735 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682387 4735 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682390 4735 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682394 4735 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682397 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682401 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682406 4735 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682410 4735 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682414 4735 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682418 4735 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682421 4735 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682424 4735 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 
10:17:21.682431 4735 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682437 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682442 4735 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682447 4735 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682451 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682455 4735 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682460 4735 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682463 4735 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682467 4735 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682471 4735 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682476 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682480 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682483 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682487 4735 feature_gate.go:330] unrecognized feature gate: Example Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682491 4735 feature_gate.go:330] 
unrecognized feature gate: PrivateHostedZoneAWS Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682508 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682514 4735 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682519 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682523 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682528 4735 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682532 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682537 4735 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682543 4735 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682548 4735 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682552 4735 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682556 4735 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682560 4735 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.682567 4735 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682707 4735 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682714 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682719 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682723 4735 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682727 4735 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682731 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 01 10:17:21 crc 
kubenswrapper[4735]: W1001 10:17:21.682736 4735 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682741 4735 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682746 4735 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682750 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682754 4735 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682758 4735 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682761 4735 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682765 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682768 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682772 4735 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682775 4735 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682779 4735 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682782 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682788 4735 feature_gate.go:330] 
unrecognized feature gate: NodeDisruptionPolicy Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682792 4735 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682795 4735 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682800 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682805 4735 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682809 4735 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682813 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682818 4735 feature_gate.go:330] unrecognized feature gate: Example Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682821 4735 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682825 4735 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682829 4735 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682832 4735 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682836 4735 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682840 4735 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682843 4735 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682847 4735 
feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682851 4735 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682855 4735 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682858 4735 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682862 4735 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682867 4735 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682871 4735 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682875 4735 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682879 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682882 4735 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682886 4735 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682890 4735 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682893 4735 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682896 4735 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682900 4735 feature_gate.go:330] unrecognized feature gate: 
MachineConfigNodes Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682904 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682907 4735 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682911 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682915 4735 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682918 4735 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682922 4735 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682926 4735 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682931 4735 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682935 4735 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682939 4735 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682943 4735 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682947 4735 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682951 4735 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682954 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682958 4735 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682961 4735 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682965 4735 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682968 4735 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682972 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682975 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682979 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.682983 4735 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.682989 4735 
feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.683905 4735 server.go:940] "Client rotation is on, will bootstrap in background" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.688107 4735 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.688206 4735 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.689464 4735 server.go:997] "Starting client certificate rotation" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.689519 4735 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.690416 4735 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-30 16:17:37.621675885 +0000 UTC Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.690566 4735 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1446h0m15.931114567s for next certificate rotation Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.714404 4735 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.716673 4735 dynamic_cafile_content.go:161] 
"Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.734794 4735 log.go:25] "Validated CRI v1 runtime API" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.768573 4735 log.go:25] "Validated CRI v1 image API" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.770732 4735 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.778078 4735 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-01-09-44-10-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.778157 4735 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.809254 4735 manager.go:217] Machine: {Timestamp:2025-10-01 10:17:21.805538444 +0000 UTC m=+0.498359786 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:f44ec9f4-8a46-42ac-97b9-caabc07abc52 BootID:2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270 Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 
Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:b9:71:da Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:b9:71:da Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:b5:3c:da Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:97:b0:ab Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:aa:97:56 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:2f:4a:fd Speed:-1 Mtu:1496} {Name:eth10 MacAddress:b2:83:d6:9a:98:6d Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:9e:3d:8e:e4:ae:c8 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 
Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 
Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.809769 4735 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.810055 4735 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.810559 4735 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.810857 4735 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.810922 4735 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.811226 4735 topology_manager.go:138] "Creating topology manager with none policy"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.811245 4735 container_manager_linux.go:303] "Creating device plugin manager"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.811718 4735 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.812325 4735 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.813139 4735 state_mem.go:36] "Initialized new in-memory state store"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.813651 4735 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.817191 4735 kubelet.go:418] "Attempting to sync node with API server"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.817226 4735 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.817307 4735 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.817332 4735 kubelet.go:324] "Adding apiserver pod source"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.817737 4735 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.821804 4735 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.822734 4735 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.824884 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused
Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.824980 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused
Oct 01 10:17:21 crc kubenswrapper[4735]: E1001 10:17:21.825078 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError"
Oct 01 10:17:21 crc kubenswrapper[4735]: E1001 10:17:21.825116 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.825461 4735 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.827162 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.827215 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.827237 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.827252 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.827274 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.827287 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.827300 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.827322 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.827336 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.827349 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.827389 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.827404 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.828519 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.829307 4735 server.go:1280] "Started kubelet"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.830178 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.830368 4735 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.830401 4735 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.830911 4735 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.831604 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.831914 4735 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.831994 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 20:01:41.726229377 +0000 UTC
Oct 01 10:17:21 crc systemd[1]: Started Kubernetes Kubelet.
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.832066 4735 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.832073 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 2457h44m19.894160865s for next certificate rotation
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.832044 4735 volume_manager.go:287] "The desired_state_of_world populator starts"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.836256 4735 volume_manager.go:289] "Starting Kubelet Volume Manager"
Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.836634 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused
Oct 01 10:17:21 crc kubenswrapper[4735]: E1001 10:17:21.836704 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError"
Oct 01 10:17:21 crc kubenswrapper[4735]: E1001 10:17:21.836762 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.839005 4735 server.go:460] "Adding debug handlers to kubelet server"
Oct 01 10:17:21 crc kubenswrapper[4735]: E1001 10:17:21.841822 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="200ms"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.842174 4735 factory.go:55] Registering systemd factory
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.842214 4735 factory.go:221] Registration of the systemd container factory successfully
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.842678 4735 factory.go:153] Registering CRI-O factory
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.842707 4735 factory.go:221] Registration of the crio container factory successfully
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.842814 4735 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.842853 4735 factory.go:103] Registering Raw factory
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.842888 4735 manager.go:1196] Started watching for new ooms in manager
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.844340 4735 manager.go:319] Starting recovery of all containers
Oct 01 10:17:21 crc kubenswrapper[4735]: E1001 10:17:21.842927 4735 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.158:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186a569e2ea92139 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-01 10:17:21.829257529 +0000 UTC m=+0.522078821,LastTimestamp:2025-10-01 10:17:21.829257529 +0000 UTC m=+0.522078821,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852178 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852242 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852257 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852270 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852283 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852298 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852309 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852321 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852337 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852347 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852378 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852395 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852407 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852421 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852435 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852449 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852462 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852474 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852488 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852520 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852533 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852545 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852556 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852569 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852582 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852597 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852630 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852646 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852658 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852695 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852708 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852739 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852769 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852808 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852821 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852834 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852845 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852858 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852870 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852883 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852928 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852939 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852952 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852964 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.852978 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.853033 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.853045 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.853057 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.853086 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.853116 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.853130 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.853141 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.853159 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.853173 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.853186 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.853198 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.853227 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.853243 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.853255 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.853269 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.853281 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.853293 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.853306 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.853319 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.853383 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.853399 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.853411 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.853424 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.856934 4735 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857025 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857090 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857111 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857156 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857180 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857195 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857212 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857227 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857241 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857260 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857275 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857373 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857396 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857437 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857452 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857467 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b"
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857484 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857531 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857546 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857595 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857611 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857627 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857641 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857655 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857668 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857682 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857696 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857732 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" 
seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857744 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857758 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857772 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857786 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857798 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857812 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 
10:17:21.857829 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.857947 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858046 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858110 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858128 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858148 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858202 4735 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858217 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858232 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858287 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858304 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858320 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858333 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858347 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858365 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858382 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858397 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858434 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858450 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858463 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858476 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858516 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858538 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858551 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858567 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858603 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858617 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858631 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858652 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858665 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858678 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858700 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858716 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858788 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858803 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858839 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858857 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" 
seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858871 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858883 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858896 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.858915 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859072 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859087 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859106 4735 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859119 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859131 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859150 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859164 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859177 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859245 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859264 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859277 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859289 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859301 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859320 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859334 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859347 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859369 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859389 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859405 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859422 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859448 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859464 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859481 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859522 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859539 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859554 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859594 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859610 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859625 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859640 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859656 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859678 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859698 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" 
seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859715 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859731 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859749 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859764 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859781 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859800 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 01 10:17:21 crc 
kubenswrapper[4735]: I1001 10:17:21.859815 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859832 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859848 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859864 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859880 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859897 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859924 4735 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859942 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859959 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859975 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.859992 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.860008 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.860025 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.860040 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.860058 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.860083 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.860102 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.860121 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.860137 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.860155 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.860170 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.860185 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.860199 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.860217 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.860232 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.860248 4735 reconstruct.go:97] "Volume reconstruction finished" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.860260 4735 reconciler.go:26] "Reconciler: start to sync state" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.864457 4735 manager.go:324] Recovery completed Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.876430 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.878904 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.878941 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.878953 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.879922 4735 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.879940 4735 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.879964 4735 state_mem.go:36] "Initialized new in-memory state store" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.893943 4735 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.895658 4735 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.895713 4735 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.895746 4735 kubelet.go:2335] "Starting kubelet main sync loop" Oct 01 10:17:21 crc kubenswrapper[4735]: E1001 10:17:21.895811 4735 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 01 10:17:21 crc kubenswrapper[4735]: W1001 10:17:21.896452 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Oct 01 10:17:21 crc kubenswrapper[4735]: E1001 10:17:21.896524 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.900848 4735 policy_none.go:49] "None policy: Start" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.901784 4735 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.901816 4735 state_mem.go:35] "Initializing new in-memory state store" Oct 01 10:17:21 crc kubenswrapper[4735]: E1001 10:17:21.937329 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.966892 4735 manager.go:334] "Starting Device Plugin manager" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.966999 4735 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.967012 4735 server.go:79] "Starting device plugin registration server" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.967474 4735 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.967510 4735 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.967721 4735 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.967803 4735 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.967811 4735 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 01 10:17:21 crc kubenswrapper[4735]: E1001 10:17:21.982355 4735 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.996603 4735 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.996718 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.997899 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.997941 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.997955 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.998129 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.998384 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.998430 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.998967 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.998996 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.999008 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.999167 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.999340 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.999385 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.999942 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:21 crc kubenswrapper[4735]: I1001 10:17:21.999963 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:21.999978 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:21.999984 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:21.999992 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:21.999999 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.000270 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.000294 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.000426 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.000447 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.000455 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.000428 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.003074 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.003113 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.003122 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.003124 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.003196 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.003226 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.003618 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:22 crc kubenswrapper[4735]: 
I1001 10:17:22.003644 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.003786 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.005677 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.005707 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.005719 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.005741 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.005777 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.005788 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.005932 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.005963 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.006633 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.006659 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.006668 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:22 crc kubenswrapper[4735]: E1001 10:17:22.043134 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="400ms" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.063287 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.063322 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.063350 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.063370 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.063392 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.063412 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.063438 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.063464 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.063485 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.063663 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.063705 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.063739 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.063772 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.063800 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.063940 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.067672 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.069101 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.069130 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.069138 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.069176 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 10:17:22 crc kubenswrapper[4735]: E1001 10:17:22.069680 4735 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 
38.102.83.158:6443: connect: connection refused" node="crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.165125 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.165185 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.165216 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.165245 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.165275 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.165306 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.165333 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.165362 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.165392 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.165423 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.165452 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.165538 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.165653 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.165551 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.165691 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.165731 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 10:17:22 
crc kubenswrapper[4735]: I1001 10:17:22.165778 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.165790 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.165732 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.165765 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.165787 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.165547 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.165727 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.165796 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.165880 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.165915 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.165940 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 
01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.165966 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.166093 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.166149 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.270372 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.271960 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.272033 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.272047 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.272074 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 10:17:22 crc kubenswrapper[4735]: E1001 10:17:22.272722 4735 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.158:6443: connect: connection refused" node="crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.346764 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.364415 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.378784 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.395243 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.401118 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 10:17:22 crc kubenswrapper[4735]: W1001 10:17:22.408159 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-4fceadcb16e4d1fcab81797db0196475cd3d98281743f72189c20d835b82bd74 WatchSource:0}: Error finding container 4fceadcb16e4d1fcab81797db0196475cd3d98281743f72189c20d835b82bd74: Status 404 returned error can't find the container with id 4fceadcb16e4d1fcab81797db0196475cd3d98281743f72189c20d835b82bd74 Oct 01 10:17:22 crc kubenswrapper[4735]: W1001 10:17:22.416702 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-edb42ea4b960e62cc90617ab40237f3bec2147ac919d3c5ba21a2cdaf5e2b7fb WatchSource:0}: Error finding container edb42ea4b960e62cc90617ab40237f3bec2147ac919d3c5ba21a2cdaf5e2b7fb: Status 404 returned error can't find the container with id edb42ea4b960e62cc90617ab40237f3bec2147ac919d3c5ba21a2cdaf5e2b7fb Oct 01 10:17:22 crc kubenswrapper[4735]: W1001 10:17:22.428708 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-8e0c11b84aac83d73ca67c72f108af9e121abcff1b8134b4f8f378a74284091b WatchSource:0}: Error finding container 8e0c11b84aac83d73ca67c72f108af9e121abcff1b8134b4f8f378a74284091b: Status 404 returned error can't find the container with id 8e0c11b84aac83d73ca67c72f108af9e121abcff1b8134b4f8f378a74284091b Oct 01 10:17:22 crc kubenswrapper[4735]: W1001 10:17:22.443582 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-b7e017285bddbde8b07c75f38443dfd4ba44d73aaab0c9c00067062317fd65f2 
WatchSource:0}: Error finding container b7e017285bddbde8b07c75f38443dfd4ba44d73aaab0c9c00067062317fd65f2: Status 404 returned error can't find the container with id b7e017285bddbde8b07c75f38443dfd4ba44d73aaab0c9c00067062317fd65f2 Oct 01 10:17:22 crc kubenswrapper[4735]: E1001 10:17:22.443720 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="800ms" Oct 01 10:17:22 crc kubenswrapper[4735]: W1001 10:17:22.457001 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-82618765a55f0b60af06a460ece48e1e0b87436d1eac8f7a256b0b37d46f9563 WatchSource:0}: Error finding container 82618765a55f0b60af06a460ece48e1e0b87436d1eac8f7a256b0b37d46f9563: Status 404 returned error can't find the container with id 82618765a55f0b60af06a460ece48e1e0b87436d1eac8f7a256b0b37d46f9563 Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.672836 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.674239 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.674272 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.674280 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.674299 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 10:17:22 crc kubenswrapper[4735]: E1001 10:17:22.674637 4735 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.158:6443: connect: connection refused" node="crc" Oct 01 10:17:22 crc kubenswrapper[4735]: W1001 10:17:22.741132 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Oct 01 10:17:22 crc kubenswrapper[4735]: E1001 10:17:22.741215 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.831243 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.900696 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8e0c11b84aac83d73ca67c72f108af9e121abcff1b8134b4f8f378a74284091b"} Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.902042 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4fceadcb16e4d1fcab81797db0196475cd3d98281743f72189c20d835b82bd74"} Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.903070 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"edb42ea4b960e62cc90617ab40237f3bec2147ac919d3c5ba21a2cdaf5e2b7fb"} Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.904208 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"82618765a55f0b60af06a460ece48e1e0b87436d1eac8f7a256b0b37d46f9563"} Oct 01 10:17:22 crc kubenswrapper[4735]: I1001 10:17:22.905362 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b7e017285bddbde8b07c75f38443dfd4ba44d73aaab0c9c00067062317fd65f2"} Oct 01 10:17:22 crc kubenswrapper[4735]: W1001 10:17:22.914917 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Oct 01 10:17:22 crc kubenswrapper[4735]: E1001 10:17:22.915144 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Oct 01 10:17:23 crc kubenswrapper[4735]: W1001 10:17:23.035671 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Oct 01 10:17:23 crc kubenswrapper[4735]: E1001 
10:17:23.035760 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Oct 01 10:17:23 crc kubenswrapper[4735]: W1001 10:17:23.037139 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Oct 01 10:17:23 crc kubenswrapper[4735]: E1001 10:17:23.037189 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Oct 01 10:17:23 crc kubenswrapper[4735]: E1001 10:17:23.245228 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="1.6s" Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.475362 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.476705 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.476738 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.476750 
4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.476773 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 10:17:23 crc kubenswrapper[4735]: E1001 10:17:23.477170 4735 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.158:6443: connect: connection refused" node="crc" Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.831874 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.911232 4735 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240" exitCode=0 Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.911315 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240"} Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.911445 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.912628 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.912674 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.912689 4735 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.913365 4735 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="3a8e0722008bed37a96ef54cbcd8129d5df9559b709b954910460a9426703a49" exitCode=0 Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.913412 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"3a8e0722008bed37a96ef54cbcd8129d5df9559b709b954910460a9426703a49"} Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.913588 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.914594 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.914825 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.914885 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.914917 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.915352 4735 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="e6bb5b2bdb9bd05c3d9a137553994d6169c1a24b1719c3a66dcef2b45678a78b" exitCode=0 Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.915455 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"e6bb5b2bdb9bd05c3d9a137553994d6169c1a24b1719c3a66dcef2b45678a78b"} Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.915470 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.915556 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.915585 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.915462 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.916273 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.916295 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.916303 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.919567 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.919567 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a049e79794f886b6622d898171e4e83753352f445e8c600665dc73b8adee3754"} Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.919784 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f"} Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.919820 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35"} Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.919991 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a5bc4c03105041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf"} Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.921251 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.921330 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.921393 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.922527 4735 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="7cb0cece7c30c3fbd18adeb8daa644b951b45a159308fa26eec931f95f0ac881" exitCode=0 Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.922618 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"7cb0cece7c30c3fbd18adeb8daa644b951b45a159308fa26eec931f95f0ac881"} Oct 01 10:17:23 crc 
kubenswrapper[4735]: I1001 10:17:23.922687 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.923744 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.923784 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:23 crc kubenswrapper[4735]: I1001 10:17:23.923799 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:24 crc kubenswrapper[4735]: I1001 10:17:24.079391 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 10:17:24 crc kubenswrapper[4735]: I1001 10:17:24.537688 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 10:17:24 crc kubenswrapper[4735]: I1001 10:17:24.831609 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Oct 01 10:17:24 crc kubenswrapper[4735]: E1001 10:17:24.846320 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="3.2s" Oct 01 10:17:24 crc kubenswrapper[4735]: W1001 10:17:24.848784 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: 
connect: connection refused Oct 01 10:17:24 crc kubenswrapper[4735]: E1001 10:17:24.848939 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Oct 01 10:17:24 crc kubenswrapper[4735]: W1001 10:17:24.901199 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Oct 01 10:17:24 crc kubenswrapper[4735]: E1001 10:17:24.901298 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Oct 01 10:17:24 crc kubenswrapper[4735]: I1001 10:17:24.942807 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"accb544db17576f62271088d4e35c414c0ed291197cf7d953466d7634edff0dd"} Oct 01 10:17:24 crc kubenswrapper[4735]: I1001 10:17:24.942866 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9e93060fbaf012ed4f8aec1624caf86aa986cea91a7ba97b7fb1dffa87343092"} Oct 01 10:17:24 crc kubenswrapper[4735]: I1001 10:17:24.942881 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"07aa8a6d6d8611abb4a1531fc38c8bbdca73cf532d2f45645fcfb0eaeb92fec9"} Oct 01 10:17:24 crc kubenswrapper[4735]: I1001 10:17:24.942981 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:24 crc kubenswrapper[4735]: I1001 10:17:24.944809 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:24 crc kubenswrapper[4735]: I1001 10:17:24.944841 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:24 crc kubenswrapper[4735]: I1001 10:17:24.944854 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:24 crc kubenswrapper[4735]: I1001 10:17:24.948668 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b"} Oct 01 10:17:24 crc kubenswrapper[4735]: I1001 10:17:24.948696 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8"} Oct 01 10:17:24 crc kubenswrapper[4735]: I1001 10:17:24.948708 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba"} Oct 01 10:17:24 crc kubenswrapper[4735]: I1001 10:17:24.948720 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7"} Oct 01 10:17:24 crc kubenswrapper[4735]: I1001 10:17:24.950699 4735 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e141ae3d66c0fc869c1f603084e06bfa2d42b5ac6f23d84a349dd50ad4c424d0" exitCode=0 Oct 01 10:17:24 crc kubenswrapper[4735]: I1001 10:17:24.950775 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e141ae3d66c0fc869c1f603084e06bfa2d42b5ac6f23d84a349dd50ad4c424d0"} Oct 01 10:17:24 crc kubenswrapper[4735]: I1001 10:17:24.950928 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:24 crc kubenswrapper[4735]: I1001 10:17:24.952048 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:24 crc kubenswrapper[4735]: I1001 10:17:24.952106 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:24 crc kubenswrapper[4735]: I1001 10:17:24.952125 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:24 crc kubenswrapper[4735]: I1001 10:17:24.955825 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"a321a308d85e66b54cf5a08968c8fa2b856b42b021af3ab8cdccbb8c7ac03652"} Oct 01 10:17:24 crc kubenswrapper[4735]: I1001 10:17:24.955904 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:24 crc kubenswrapper[4735]: I1001 10:17:24.955951 4735 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Oct 01 10:17:24 crc kubenswrapper[4735]: I1001 10:17:24.957248 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:24 crc kubenswrapper[4735]: I1001 10:17:24.957307 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:24 crc kubenswrapper[4735]: I1001 10:17:24.957335 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:24 crc kubenswrapper[4735]: I1001 10:17:24.957271 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:24 crc kubenswrapper[4735]: I1001 10:17:24.957396 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:24 crc kubenswrapper[4735]: I1001 10:17:24.957410 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:25 crc kubenswrapper[4735]: I1001 10:17:25.077779 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:25 crc kubenswrapper[4735]: I1001 10:17:25.080262 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:25 crc kubenswrapper[4735]: I1001 10:17:25.080306 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:25 crc kubenswrapper[4735]: I1001 10:17:25.080322 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:25 crc kubenswrapper[4735]: I1001 10:17:25.080350 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 10:17:25 crc kubenswrapper[4735]: E1001 10:17:25.080813 4735 kubelet_node_status.go:99] "Unable to register 
node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.158:6443: connect: connection refused" node="crc" Oct 01 10:17:25 crc kubenswrapper[4735]: W1001 10:17:25.197394 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Oct 01 10:17:25 crc kubenswrapper[4735]: E1001 10:17:25.197949 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Oct 01 10:17:25 crc kubenswrapper[4735]: I1001 10:17:25.310558 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 10:17:25 crc kubenswrapper[4735]: I1001 10:17:25.781814 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 10:17:25 crc kubenswrapper[4735]: I1001 10:17:25.965295 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15"} Oct 01 10:17:25 crc kubenswrapper[4735]: I1001 10:17:25.965462 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:25 crc kubenswrapper[4735]: I1001 10:17:25.966962 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:25 crc kubenswrapper[4735]: I1001 10:17:25.967028 4735 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:25 crc kubenswrapper[4735]: I1001 10:17:25.967049 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:25 crc kubenswrapper[4735]: I1001 10:17:25.967750 4735 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ff1808c3b0f4dfc3f8ad74de5eb69d61e10a5892a375ed5d6ededec934f6dfae" exitCode=0 Oct 01 10:17:25 crc kubenswrapper[4735]: I1001 10:17:25.967890 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:25 crc kubenswrapper[4735]: I1001 10:17:25.967916 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:25 crc kubenswrapper[4735]: I1001 10:17:25.967861 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ff1808c3b0f4dfc3f8ad74de5eb69d61e10a5892a375ed5d6ededec934f6dfae"} Oct 01 10:17:25 crc kubenswrapper[4735]: I1001 10:17:25.967897 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:25 crc kubenswrapper[4735]: I1001 10:17:25.968209 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:25 crc kubenswrapper[4735]: I1001 10:17:25.968930 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:25 crc kubenswrapper[4735]: I1001 10:17:25.968963 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:25 crc kubenswrapper[4735]: I1001 10:17:25.968972 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 
10:17:25 crc kubenswrapper[4735]: I1001 10:17:25.969146 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:25 crc kubenswrapper[4735]: I1001 10:17:25.969181 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:25 crc kubenswrapper[4735]: I1001 10:17:25.969197 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:25 crc kubenswrapper[4735]: I1001 10:17:25.969544 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:25 crc kubenswrapper[4735]: I1001 10:17:25.969577 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:25 crc kubenswrapper[4735]: I1001 10:17:25.969589 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:25 crc kubenswrapper[4735]: I1001 10:17:25.970176 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:25 crc kubenswrapper[4735]: I1001 10:17:25.970199 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:25 crc kubenswrapper[4735]: I1001 10:17:25.970207 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:26 crc kubenswrapper[4735]: I1001 10:17:26.527343 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 10:17:26 crc kubenswrapper[4735]: I1001 10:17:26.973049 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:26 crc kubenswrapper[4735]: I1001 10:17:26.973121 4735 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Oct 01 10:17:26 crc kubenswrapper[4735]: I1001 10:17:26.973170 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:26 crc kubenswrapper[4735]: I1001 10:17:26.973050 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e82af6aa8cd80d0817a1ecefb6b240b36ed284906f9ad6bc1fe1a2c176d0a5b4"} Oct 01 10:17:26 crc kubenswrapper[4735]: I1001 10:17:26.973232 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 10:17:26 crc kubenswrapper[4735]: I1001 10:17:26.973247 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6da221f065e1113df91f511554df0ea4ec4615b1fa4a86c28416e59baf7abc85"} Oct 01 10:17:26 crc kubenswrapper[4735]: I1001 10:17:26.973258 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"609ae492f258f9461082bee85100a7cdb74c1f53a10677141fdd2e98e10c0fde"} Oct 01 10:17:26 crc kubenswrapper[4735]: I1001 10:17:26.973269 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dc67e1102794df69d3a244c5c84544734e638fa9961c9215aad48b3378a5b49c"} Oct 01 10:17:26 crc kubenswrapper[4735]: I1001 10:17:26.973280 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"271515a0b5291da42aca4d2d02fa7cb39bb09248693d575c21ac1607f67d8926"} Oct 01 10:17:26 crc kubenswrapper[4735]: I1001 10:17:26.973053 4735 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Oct 01 10:17:26 crc kubenswrapper[4735]: I1001 10:17:26.973891 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:26 crc kubenswrapper[4735]: I1001 10:17:26.973933 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:26 crc kubenswrapper[4735]: I1001 10:17:26.973944 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:26 crc kubenswrapper[4735]: I1001 10:17:26.974535 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:26 crc kubenswrapper[4735]: I1001 10:17:26.974554 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:26 crc kubenswrapper[4735]: I1001 10:17:26.974562 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:26 crc kubenswrapper[4735]: I1001 10:17:26.974913 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:26 crc kubenswrapper[4735]: I1001 10:17:26.974929 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:26 crc kubenswrapper[4735]: I1001 10:17:26.974937 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:26 crc kubenswrapper[4735]: I1001 10:17:26.975313 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:26 crc kubenswrapper[4735]: I1001 10:17:26.975350 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:26 crc kubenswrapper[4735]: I1001 
10:17:26.975361 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:27 crc kubenswrapper[4735]: I1001 10:17:27.538282 4735 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 01 10:17:27 crc kubenswrapper[4735]: I1001 10:17:27.538359 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 10:17:27 crc kubenswrapper[4735]: I1001 10:17:27.976067 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:27 crc kubenswrapper[4735]: I1001 10:17:27.976215 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:27 crc kubenswrapper[4735]: I1001 10:17:27.977954 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:27 crc kubenswrapper[4735]: I1001 10:17:27.978017 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:27 crc kubenswrapper[4735]: I1001 10:17:27.978040 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:27 crc kubenswrapper[4735]: I1001 10:17:27.978311 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:27 crc kubenswrapper[4735]: I1001 
10:17:27.978394 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:27 crc kubenswrapper[4735]: I1001 10:17:27.978456 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:28 crc kubenswrapper[4735]: I1001 10:17:28.281135 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:28 crc kubenswrapper[4735]: I1001 10:17:28.282904 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:28 crc kubenswrapper[4735]: I1001 10:17:28.282942 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:28 crc kubenswrapper[4735]: I1001 10:17:28.282958 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:28 crc kubenswrapper[4735]: I1001 10:17:28.282977 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 10:17:28 crc kubenswrapper[4735]: I1001 10:17:28.353339 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 10:17:28 crc kubenswrapper[4735]: I1001 10:17:28.978669 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:28 crc kubenswrapper[4735]: I1001 10:17:28.982572 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:28 crc kubenswrapper[4735]: I1001 10:17:28.982611 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:28 crc kubenswrapper[4735]: I1001 10:17:28.982620 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:29 
crc kubenswrapper[4735]: I1001 10:17:29.704174 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 01 10:17:29 crc kubenswrapper[4735]: I1001 10:17:29.704386 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:29 crc kubenswrapper[4735]: I1001 10:17:29.705884 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:29 crc kubenswrapper[4735]: I1001 10:17:29.705979 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:29 crc kubenswrapper[4735]: I1001 10:17:29.706000 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:31 crc kubenswrapper[4735]: I1001 10:17:31.596057 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 10:17:31 crc kubenswrapper[4735]: I1001 10:17:31.596422 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:31 crc kubenswrapper[4735]: I1001 10:17:31.597727 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:31 crc kubenswrapper[4735]: I1001 10:17:31.597776 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:31 crc kubenswrapper[4735]: I1001 10:17:31.597793 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:31 crc kubenswrapper[4735]: I1001 10:17:31.600688 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 10:17:31 crc kubenswrapper[4735]: I1001 10:17:31.918610 4735 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 01 10:17:31 crc kubenswrapper[4735]: I1001 10:17:31.918814 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:31 crc kubenswrapper[4735]: I1001 10:17:31.920132 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:31 crc kubenswrapper[4735]: I1001 10:17:31.920164 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:31 crc kubenswrapper[4735]: I1001 10:17:31.920175 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:31 crc kubenswrapper[4735]: E1001 10:17:31.982813 4735 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 01 10:17:31 crc kubenswrapper[4735]: I1001 10:17:31.986635 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:31 crc kubenswrapper[4735]: I1001 10:17:31.987621 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:31 crc kubenswrapper[4735]: I1001 10:17:31.987645 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:31 crc kubenswrapper[4735]: I1001 10:17:31.987653 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:35 crc kubenswrapper[4735]: I1001 10:17:35.614628 4735 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" 
start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 01 10:17:35 crc kubenswrapper[4735]: I1001 10:17:35.614678 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 01 10:17:35 crc kubenswrapper[4735]: I1001 10:17:35.618229 4735 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 01 10:17:35 crc kubenswrapper[4735]: I1001 10:17:35.618275 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 01 10:17:35 crc kubenswrapper[4735]: I1001 10:17:35.786751 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 10:17:35 crc kubenswrapper[4735]: I1001 10:17:35.786892 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:35 crc kubenswrapper[4735]: I1001 10:17:35.788084 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:35 crc kubenswrapper[4735]: I1001 10:17:35.788123 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 10:17:35 crc kubenswrapper[4735]: I1001 10:17:35.788134 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:37 crc kubenswrapper[4735]: I1001 10:17:37.538862 4735 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 01 10:17:37 crc kubenswrapper[4735]: I1001 10:17:37.538931 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 01 10:17:38 crc kubenswrapper[4735]: I1001 10:17:38.359105 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 10:17:38 crc kubenswrapper[4735]: I1001 10:17:38.359360 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:38 crc kubenswrapper[4735]: I1001 10:17:38.360645 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:38 crc kubenswrapper[4735]: I1001 10:17:38.360692 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:38 crc kubenswrapper[4735]: I1001 10:17:38.360708 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:38 crc kubenswrapper[4735]: I1001 
10:17:38.362870 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 10:17:39 crc kubenswrapper[4735]: I1001 10:17:39.002001 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:39 crc kubenswrapper[4735]: I1001 10:17:39.002903 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:39 crc kubenswrapper[4735]: I1001 10:17:39.002933 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:39 crc kubenswrapper[4735]: I1001 10:17:39.002941 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:39 crc kubenswrapper[4735]: I1001 10:17:39.728771 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 01 10:17:39 crc kubenswrapper[4735]: I1001 10:17:39.728936 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 10:17:39 crc kubenswrapper[4735]: I1001 10:17:39.731609 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:39 crc kubenswrapper[4735]: I1001 10:17:39.731674 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:39 crc kubenswrapper[4735]: I1001 10:17:39.731684 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:39 crc kubenswrapper[4735]: I1001 10:17:39.740605 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.004457 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" 
Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.005520 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.005555 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.005567 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:40 crc kubenswrapper[4735]: E1001 10:17:40.614327 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.615974 4735 trace.go:236] Trace[1971766157]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Oct-2025 10:17:29.184) (total time: 11431ms): Oct 01 10:17:40 crc kubenswrapper[4735]: Trace[1971766157]: ---"Objects listed" error: 11431ms (10:17:40.615) Oct 01 10:17:40 crc kubenswrapper[4735]: Trace[1971766157]: [11.431586343s] [11.431586343s] END Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.615998 4735 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.616917 4735 trace.go:236] Trace[410331743]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Oct-2025 10:17:29.994) (total time: 10622ms): Oct 01 10:17:40 crc kubenswrapper[4735]: Trace[410331743]: ---"Objects listed" error: 10622ms (10:17:40.616) Oct 01 10:17:40 crc kubenswrapper[4735]: Trace[410331743]: [10.622782349s] [10.622782349s] END Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.616944 4735 reflector.go:368] Caches populated for *v1.RuntimeClass from 
k8s.io/client-go/informers/factory.go:160 Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.616999 4735 trace.go:236] Trace[1258758325]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Oct-2025 10:17:30.139) (total time: 10477ms): Oct 01 10:17:40 crc kubenswrapper[4735]: Trace[1258758325]: ---"Objects listed" error: 10477ms (10:17:40.616) Oct 01 10:17:40 crc kubenswrapper[4735]: Trace[1258758325]: [10.477528721s] [10.477528721s] END Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.617014 4735 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.620744 4735 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.622840 4735 trace.go:236] Trace[79130875]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Oct-2025 10:17:26.111) (total time: 14510ms): Oct 01 10:17:40 crc kubenswrapper[4735]: Trace[79130875]: ---"Objects listed" error: 14510ms (10:17:40.622) Oct 01 10:17:40 crc kubenswrapper[4735]: Trace[79130875]: [14.510984014s] [14.510984014s] END Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.622858 4735 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.627581 4735 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.627663 4735 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.628666 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.628706 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:40 
crc kubenswrapper[4735]: I1001 10:17:40.628715 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.628731 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.628740 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:40Z","lastTransitionTime":"2025-10-01T10:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:40 crc kubenswrapper[4735]: E1001 10:17:40.676106 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.681153 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.681190 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.681198 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.681210 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.681239 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:40Z","lastTransitionTime":"2025-10-01T10:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:40 crc kubenswrapper[4735]: E1001 10:17:40.691313 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.694598 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.694624 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.694632 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.694655 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.694665 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:40Z","lastTransitionTime":"2025-10-01T10:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:40 crc kubenswrapper[4735]: E1001 10:17:40.703470 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.706610 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.706646 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.706657 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.706671 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.706680 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:40Z","lastTransitionTime":"2025-10-01T10:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:40 crc kubenswrapper[4735]: E1001 10:17:40.715977 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.719024 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.719058 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.719069 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.719083 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.719092 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:40Z","lastTransitionTime":"2025-10-01T10:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:40 crc kubenswrapper[4735]: E1001 10:17:40.727308 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 10:17:40 crc kubenswrapper[4735]: E1001 10:17:40.727422 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.728886 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.728928 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.728937 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.728951 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.728960 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:40Z","lastTransitionTime":"2025-10-01T10:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.829685 4735 apiserver.go:52] "Watching apiserver" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.831111 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.831147 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.831158 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.831173 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.831183 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:40Z","lastTransitionTime":"2025-10-01T10:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.833519 4735 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.833817 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-5n9cx","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.834150 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.834268 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:17:40 crc kubenswrapper[4735]: E1001 10:17:40.834332 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.834353 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:17:40 crc kubenswrapper[4735]: E1001 10:17:40.834412 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.834518 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.834871 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.834886 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:17:40 crc kubenswrapper[4735]: E1001 10:17:40.834924 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.835152 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-5n9cx" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.835899 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.836074 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.836748 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.836780 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.836711 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.836957 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.837025 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.837415 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.837530 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.837835 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.838359 4735 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.838480 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.857715 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.882408 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.903431 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.918565 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.929242 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.932803 4735 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.934222 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.934272 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.934285 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 
10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.934301 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.934313 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:40Z","lastTransitionTime":"2025-10-01T10:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.942292 4735 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:53182->192.168.126.11:17697: read: connection reset by peer" start-of-body= Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.942348 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:53182->192.168.126.11:17697: read: connection reset by peer" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.942749 4735 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:53198->192.168.126.11:17697: read: connection reset by peer" start-of-body= Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.942882 4735 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:53198->192.168.126.11:17697: read: connection reset by peer" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.943321 4735 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.943380 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.948397 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.967149 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 10:17:40 crc kubenswrapper[4735]: I1001 10:17:40.992043 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.008049 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.009759 4735 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15" exitCode=255 Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.009801 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15"} Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.022741 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.022790 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.022811 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.022831 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.022854 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.022877 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.022901 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.022922 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.022941 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.022963 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: 
\"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.022982 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023001 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023022 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023041 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023089 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023108 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023142 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023182 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023203 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023223 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023244 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 01 10:17:41 crc kubenswrapper[4735]: 
I1001 10:17:41.023248 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023264 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023386 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023408 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023426 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023447 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023464 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023482 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023513 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023529 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023544 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 
01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023542 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023562 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023582 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023597 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023616 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023634 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023649 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023664 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023680 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023695 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023711 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023700 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023728 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023746 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023755 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023762 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023775 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023802 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023831 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023855 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023878 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023906 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023929 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023953 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023975 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023994 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024011 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024032 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024052 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024072 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024094 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024114 4735 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024136 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024155 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024174 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024218 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024240 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024262 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024290 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024314 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024336 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024359 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024379 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024398 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024417 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024435 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024458 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024512 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024536 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024559 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024582 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024603 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024624 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024644 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" 
(UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024666 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024687 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024708 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024730 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024752 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " 
Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024775 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024797 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024817 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024843 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024866 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024899 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024922 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024942 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024963 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024985 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025007 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025030 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025053 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025076 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025122 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025143 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025165 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025187 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025213 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025234 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025258 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025280 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") 
pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025301 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025322 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025344 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025366 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025390 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025412 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025451 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025474 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025508 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025533 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025557 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 01 10:17:41 crc kubenswrapper[4735]: 
I1001 10:17:41.025582 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025604 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025627 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025651 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025675 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025698 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025739 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025763 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025793 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025814 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025836 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025860 4735 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025882 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025903 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025927 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025948 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025967 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod 
\"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025989 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.026010 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.026032 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.026056 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.026078 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.026105 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.026126 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.026148 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.026170 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.026193 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.026216 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.026240 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.026263 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.026291 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.026316 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.026340 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.026376 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.026912 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.026937 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.026982 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027005 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027044 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod 
\"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027072 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027094 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027134 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027157 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027181 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027220 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027244 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027265 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027304 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027326 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027347 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod 
\"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027387 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027413 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027452 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027474 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027515 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027538 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027559 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027600 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027625 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027648 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027687 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027710 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027733 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027773 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027796 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027816 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027867 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027889 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027912 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027954 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027984 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.028028 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.028058 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.028199 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.028226 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.028273 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.028325 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9-hosts-file\") pod \"node-resolver-5n9cx\" (UID: \"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\") " pod="openshift-dns/node-resolver-5n9cx" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.028381 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpxch\" (UniqueName: \"kubernetes.io/projected/17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9-kube-api-access-jpxch\") pod \"node-resolver-5n9cx\" (UID: \"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\") " pod="openshift-dns/node-resolver-5n9cx" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.028413 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.028480 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.028531 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.028567 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.030821 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.030869 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.031274 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.031303 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.031741 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod 
\"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.031772 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.031858 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.031884 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.031972 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.031999 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: 
\"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.032097 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.032133 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.032149 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.032165 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.032179 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.052051 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.064026 4735 scope.go:117] "RemoveContainer" containerID="7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.064853 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.065103 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.065122 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.065140 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.065152 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:41Z","lastTransitionTime":"2025-10-01T10:17:41Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.065725 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.070047 4735 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.070746 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.073946 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.074018 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.085479 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.089410 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.079966 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.023906 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024001 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.093465 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024036 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024181 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024329 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024449 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024485 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024781 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.024938 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025144 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025294 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025330 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025466 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025521 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.025807 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.026016 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.026293 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.026410 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.026602 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.026810 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.027984 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.028071 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.028344 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.028576 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.028603 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.028727 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.028730 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.028719 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.028870 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.028888 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.028959 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.028982 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.029249 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.029269 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.029565 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.029568 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.029600 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.029653 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.029768 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.029808 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.029821 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.030139 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.030333 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.030445 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.030552 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.038921 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.039571 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.039839 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.040056 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.040351 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.040658 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.040846 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.040943 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.041136 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.041226 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.041241 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.041255 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.051805 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.059079 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.059541 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.039969 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: E1001 10:17:41.065372 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:17:41.565353951 +0000 UTC m=+20.258175213 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:17:41 crc kubenswrapper[4735]: E1001 10:17:41.066421 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.067692 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.067723 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.067744 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.067746 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.067759 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.067890 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.068117 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.068207 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.068329 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.068335 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.068162 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.068550 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.068567 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.068848 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.068838 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.093864 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.093870 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.068935 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.069215 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.069257 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.069276 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.069547 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.069863 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: E1001 10:17:41.070212 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.070368 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.070687 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.070736 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.070756 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). 
InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.070973 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.071075 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.071188 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.071541 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.071655 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.071812 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.072052 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.072062 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.072645 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.072627 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.072958 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.073370 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.074597 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.074595 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.075659 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.083487 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.084646 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.094376 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.085306 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.085641 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.086126 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.086196 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.086743 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.087029 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.087083 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.087244 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.087810 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.087839 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.087899 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.088100 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.088186 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.088452 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.088673 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.088732 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.089019 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.089035 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.089086 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.089442 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.090011 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.090154 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.090197 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.090210 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.090287 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.090438 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.090548 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.090582 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.090810 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.090812 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.090929 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.090994 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.091025 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.091070 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.091077 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.091264 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.091322 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.091409 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.091456 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.091481 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.091799 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: E1001 10:17:41.092115 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.092191 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.091877 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.092450 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.092813 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.093003 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.093383 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.093528 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.092903 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.069181 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.094946 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.095410 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.095474 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: E1001 10:17:41.095596 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 10:17:41.595568383 +0000 UTC m=+20.288389645 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 10:17:41 crc kubenswrapper[4735]: E1001 10:17:41.096347 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 10:17:41 crc kubenswrapper[4735]: E1001 10:17:41.096374 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 10:17:41 crc kubenswrapper[4735]: E1001 10:17:41.096540 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 10:17:41 crc kubenswrapper[4735]: E1001 10:17:41.096561 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 10:17:41 crc kubenswrapper[4735]: E1001 10:17:41.096572 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.096706 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: E1001 10:17:41.096835 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 10:17:41.59662059 +0000 UTC m=+20.289441852 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.098048 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: E1001 10:17:41.098292 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 10:17:41.598275595 +0000 UTC m=+20.291096857 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 10:17:41 crc kubenswrapper[4735]: E1001 10:17:41.098319 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 10:17:41.598312836 +0000 UTC m=+20.291134098 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.099865 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.104051 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). 
InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.104476 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.105164 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.103955 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.104613 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.104975 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.105149 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.105353 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.105536 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.106525 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.106537 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.107034 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.107086 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.107375 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.107717 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.107943 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.108445 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.108551 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.108737 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.109138 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.109597 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.109788 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.111992 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.112277 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.112323 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.112623 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.112919 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.115179 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.119672 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.120621 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.128932 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.132945 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133308 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133381 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9-hosts-file\") pod \"node-resolver-5n9cx\" (UID: \"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\") " pod="openshift-dns/node-resolver-5n9cx" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133419 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpxch\" (UniqueName: \"kubernetes.io/projected/17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9-kube-api-access-jpxch\") pod \"node-resolver-5n9cx\" (UID: \"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\") " pod="openshift-dns/node-resolver-5n9cx" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133483 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133506 4735 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133516 4735 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133525 4735 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133533 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133541 4735 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133551 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133559 4735 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133567 4735 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on 
node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133575 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133583 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133591 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133599 4735 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133607 4735 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133615 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133625 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 
10:17:41.133633 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133661 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133670 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133678 4735 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133686 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133694 4735 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133730 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc 
kubenswrapper[4735]: I1001 10:17:41.133739 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133746 4735 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133754 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133764 4735 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133772 4735 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133780 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133788 4735 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133814 4735 reconciler_common.go:293] 
"Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133823 4735 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133831 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133840 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133849 4735 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133857 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133864 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133873 4735 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133890 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133899 4735 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133908 4735 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133916 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133924 4735 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133931 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133939 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" 
Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133947 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133954 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133963 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133970 4735 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133979 4735 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133987 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.133994 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134003 4735 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134011 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134019 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134027 4735 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134035 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134044 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134052 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134111 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134123 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9-hosts-file\") pod \"node-resolver-5n9cx\" (UID: \"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\") " pod="openshift-dns/node-resolver-5n9cx" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134059 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134156 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134164 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134172 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134182 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc 
kubenswrapper[4735]: I1001 10:17:41.133390 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134190 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134337 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134354 4735 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134368 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134378 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134386 4735 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 
10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134394 4735 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134402 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134410 4735 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134418 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134451 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134460 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134470 4735 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134479 4735 
reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134488 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134511 4735 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134519 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134665 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134683 4735 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134696 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134707 4735 reconciler_common.go:293] "Volume detached for volume 
\"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134718 4735 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134730 4735 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134770 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134783 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134795 4735 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134837 4735 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134847 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134855 4735 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134863 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134872 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134881 4735 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134890 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134900 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134908 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on 
node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134916 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134924 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134932 4735 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134941 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134952 4735 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134961 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134970 4735 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134978 4735 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134986 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.134994 4735 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.135002 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.135010 4735 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.135018 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.135026 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.135035 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: 
\"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.135044 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.135052 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.135060 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.135068 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.135079 4735 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.135087 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.135096 4735 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.135104 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.135112 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.135120 4735 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.135129 4735 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.135137 4735 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.135147 4735 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.135155 4735 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc 
kubenswrapper[4735]: I1001 10:17:41.135162 4735 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.135171 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.135180 4735 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.135189 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.135197 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.135205 4735 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.135213 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.135221 4735 reconciler_common.go:293] "Volume detached 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.135229 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.135237 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.135245 4735 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.135255 4735 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.136119 4735 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.136134 4735 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.136143 4735 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.136175 4735 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.136183 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.136192 4735 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.136201 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.136209 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.136241 4735 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.136251 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.136323 4735 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.136334 4735 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.137579 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.137632 4735 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.137643 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.137653 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.137681 4735 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath 
\"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.137691 4735 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.137702 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.137711 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.137720 4735 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.137727 4735 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.137749 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.137736 4735 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.138509 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.138532 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.138548 4735 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.138560 4735 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.138607 4735 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.138620 4735 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 01 
10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.138632 4735 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.138668 4735 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.138680 4735 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.138691 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.138654 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.138739 4735 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.138753 4735 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" 
DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.138764 4735 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.138775 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.138786 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.138797 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.139379 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.139394 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.139404 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.139414 4735 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.139422 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.139429 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.139439 4735 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.139448 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.139459 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.147168 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.148193 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpxch\" (UniqueName: \"kubernetes.io/projected/17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9-kube-api-access-jpxch\") pod \"node-resolver-5n9cx\" (UID: \"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\") " pod="openshift-dns/node-resolver-5n9cx" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.149050 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.153930 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.157813 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 10:17:41 crc kubenswrapper[4735]: W1001 10:17:41.162771 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-260362322cd50297a08dca4dfe2aeebe0da38fa646195e9e6eb9b379f85be28f WatchSource:0}: Error finding container 260362322cd50297a08dca4dfe2aeebe0da38fa646195e9e6eb9b379f85be28f: Status 404 returned error can't find the container with id 260362322cd50297a08dca4dfe2aeebe0da38fa646195e9e6eb9b379f85be28f Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.165246 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.168098 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.168121 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.168129 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.168143 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.168152 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:41Z","lastTransitionTime":"2025-10-01T10:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.171654 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.174297 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 10:17:41 crc kubenswrapper[4735]: W1001 10:17:41.183232 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-945b50555c3a3c3c17b742bfa8ddfb234da597d7a548bd02b1caee06561353d5 WatchSource:0}: Error finding container 945b50555c3a3c3c17b742bfa8ddfb234da597d7a548bd02b1caee06561353d5: Status 404 returned error can't find the container with id 945b50555c3a3c3c17b742bfa8ddfb234da597d7a548bd02b1caee06561353d5 Oct 01 10:17:41 crc kubenswrapper[4735]: W1001 10:17:41.185742 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-91d14803432f234ebb817272265b43e8d89be260c55a3cc3c26bb7bac06a81ee WatchSource:0}: Error finding container 91d14803432f234ebb817272265b43e8d89be260c55a3cc3c26bb7bac06a81ee: Status 404 returned error can't find the container with id 91d14803432f234ebb817272265b43e8d89be260c55a3cc3c26bb7bac06a81ee Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.186658 4735 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5n9cx" Oct 01 10:17:41 crc kubenswrapper[4735]: W1001 10:17:41.211801 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17f1afd4_6acd_4c1c_adb3_e60c2ade4aa9.slice/crio-aac10157da504949d94743539f7787b7803b78364cd0e75af2fa19ac326a18a6 WatchSource:0}: Error finding container aac10157da504949d94743539f7787b7803b78364cd0e75af2fa19ac326a18a6: Status 404 returned error can't find the container with id aac10157da504949d94743539f7787b7803b78364cd0e75af2fa19ac326a18a6 Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.239828 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.272639 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.272680 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.272692 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.272709 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.272719 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:41Z","lastTransitionTime":"2025-10-01T10:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.375285 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.375313 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.375322 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.375335 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.375345 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:41Z","lastTransitionTime":"2025-10-01T10:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.477445 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.477481 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.477506 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.477519 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.477528 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:41Z","lastTransitionTime":"2025-10-01T10:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.579811 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.579859 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.579868 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.579884 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.579894 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:41Z","lastTransitionTime":"2025-10-01T10:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.643073 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.643150 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.643178 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.643206 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.643230 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:17:41 crc kubenswrapper[4735]: E1001 10:17:41.643287 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:17:42.64325654 +0000 UTC m=+21.336077803 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:17:41 crc kubenswrapper[4735]: E1001 10:17:41.643337 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 10:17:41 crc kubenswrapper[4735]: E1001 10:17:41.643387 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 10:17:41 crc kubenswrapper[4735]: E1001 10:17:41.643404 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 10:17:42.643388024 +0000 UTC m=+21.336209336 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 10:17:41 crc kubenswrapper[4735]: E1001 10:17:41.643415 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 10:17:41 crc kubenswrapper[4735]: E1001 10:17:41.643473 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 10:17:41 crc kubenswrapper[4735]: E1001 10:17:41.643387 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 10:17:41 crc kubenswrapper[4735]: E1001 10:17:41.643517 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 10:17:41 crc kubenswrapper[4735]: E1001 10:17:41.643530 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 10:17:41 crc kubenswrapper[4735]: E1001 10:17:41.643539 4735 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 10:17:42.643527628 +0000 UTC m=+21.336348960 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 10:17:41 crc kubenswrapper[4735]: E1001 10:17:41.643562 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 10:17:42.643551498 +0000 UTC m=+21.336372810 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 10:17:41 crc kubenswrapper[4735]: E1001 10:17:41.643387 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 10:17:41 crc kubenswrapper[4735]: E1001 10:17:41.643598 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-01 10:17:42.643590319 +0000 UTC m=+21.336411581 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.682233 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.682278 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.682289 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.682304 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.682314 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:41Z","lastTransitionTime":"2025-10-01T10:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.784572 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.784606 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.784616 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.784632 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.784641 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:41Z","lastTransitionTime":"2025-10-01T10:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.886726 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.886780 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.886792 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.886807 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.886818 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:41Z","lastTransitionTime":"2025-10-01T10:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.899399 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.899904 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.900666 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.901226 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.901792 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.902241 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.903786 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.904281 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.905228 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.905809 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.906662 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.907269 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.907775 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.908615 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.909096 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.909930 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.910427 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.910834 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.911769 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.912279 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.913112 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.913646 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.914043 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.914037 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"
/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-o
perator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:41Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.914995 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.915356 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.916324 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.917133 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.917952 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.920348 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.922401 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.922864 4735 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.922958 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.926659 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.927411 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.927843 4735 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.928402 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:41Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.930832 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.931510 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.932514 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.933208 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 
10:17:41.934254 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.934784 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.935794 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.936412 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.937582 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.938014 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.938921 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.939399 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 
10:17:41.940474 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.940965 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.941885 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.942317 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.942824 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.942789 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:41Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.943747 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.944182 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.957656 4735 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:41Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.971161 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:41Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.985731 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:41Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.988798 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:41 crc 
kubenswrapper[4735]: I1001 10:17:41.988859 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.988883 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.989204 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:41 crc kubenswrapper[4735]: I1001 10:17:41.989241 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:41Z","lastTransitionTime":"2025-10-01T10:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.005141 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.013487 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c"} Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.013566 4735 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf"} Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.013580 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"91d14803432f234ebb817272265b43e8d89be260c55a3cc3c26bb7bac06a81ee"} Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.015241 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5n9cx" event={"ID":"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9","Type":"ContainerStarted","Data":"ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa"} Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.015295 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5n9cx" event={"ID":"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9","Type":"ContainerStarted","Data":"aac10157da504949d94743539f7787b7803b78364cd0e75af2fa19ac326a18a6"} Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.016223 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"945b50555c3a3c3c17b742bfa8ddfb234da597d7a548bd02b1caee06561353d5"} Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.017598 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039"} Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.017625 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"260362322cd50297a08dca4dfe2aeebe0da38fa646195e9e6eb9b379f85be28f"} Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.019161 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.020380 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957"} Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.020933 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.021160 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.032249 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.043004 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.053098 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.063353 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.074238 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.087440 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.091456 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.091476 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.091484 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.091514 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.091524 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:42Z","lastTransitionTime":"2025-10-01T10:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.106966 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.118391 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.193653 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.193691 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.193699 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.193715 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.193727 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:42Z","lastTransitionTime":"2025-10-01T10:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.296038 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.296077 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.296086 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.296100 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.296109 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:42Z","lastTransitionTime":"2025-10-01T10:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.398974 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.399009 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.399018 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.399043 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.399054 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:42Z","lastTransitionTime":"2025-10-01T10:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.501226 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.501251 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.501258 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.501270 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.501278 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:42Z","lastTransitionTime":"2025-10-01T10:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.538235 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-8dz9b"] Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.538582 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-xgg24"] Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.538761 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.538805 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-6qlsd"] Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.539119 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.540031 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" Oct 01 10:17:42 crc kubenswrapper[4735]: W1001 10:17:42.541773 4735 reflector.go:561] object-"openshift-machine-config-operator"/"proxy-tls": failed to list *v1.Secret: secrets "proxy-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Oct 01 10:17:42 crc kubenswrapper[4735]: E1001 10:17:42.541808 4735 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"proxy-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"proxy-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 01 10:17:42 crc kubenswrapper[4735]: W1001 10:17:42.541881 4735 reflector.go:561] object-"openshift-machine-config-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.541912 4735 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 01 10:17:42 crc kubenswrapper[4735]: E1001 10:17:42.541913 4735 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.542259 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.542323 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.543619 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.543660 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.543749 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.543762 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.544599 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.544864 4735 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-multus"/"cni-copy-resources" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.545270 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.560157 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.576554 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.596373 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.603402 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.603441 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.603450 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.603463 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.603475 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:42Z","lastTransitionTime":"2025-10-01T10:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.617375 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.634465 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.650479 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.650570 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8c2fdbf0-2469-4ca0-8624-d63609123cd1-mcd-auth-proxy-config\") pod \"machine-config-daemon-xgg24\" (UID: \"8c2fdbf0-2469-4ca0-8624-d63609123cd1\") " pod="openshift-machine-config-operator/machine-config-daemon-xgg24" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.650593 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-host-run-multus-certs\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.650608 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr2m9\" (UniqueName: \"kubernetes.io/projected/8c2fdbf0-2469-4ca0-8624-d63609123cd1-kube-api-access-gr2m9\") pod \"machine-config-daemon-xgg24\" (UID: \"8c2fdbf0-2469-4ca0-8624-d63609123cd1\") " pod="openshift-machine-config-operator/machine-config-daemon-xgg24" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.650621 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-system-cni-dir\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.650635 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-multus-cni-dir\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: E1001 10:17:42.650676 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:17:44.650648855 +0000 UTC m=+23.343470117 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.650738 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.650780 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.650812 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-multus-conf-dir\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: E1001 10:17:42.650839 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 10:17:42 crc kubenswrapper[4735]: E1001 10:17:42.650858 4735 projected.go:288] 
Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 10:17:42 crc kubenswrapper[4735]: E1001 10:17:42.650868 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 10:17:42 crc kubenswrapper[4735]: E1001 10:17:42.650887 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.650839 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8c2fdbf0-2469-4ca0-8624-d63609123cd1-rootfs\") pod \"machine-config-daemon-xgg24\" (UID: \"8c2fdbf0-2469-4ca0-8624-d63609123cd1\") " pod="openshift-machine-config-operator/machine-config-daemon-xgg24" Oct 01 10:17:42 crc kubenswrapper[4735]: E1001 10:17:42.650905 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 10:17:44.650896312 +0000 UTC m=+23.343717574 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.650976 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/63cf5011-57e9-44fa-a662-f30391ef1ff8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6qlsd\" (UID: \"63cf5011-57e9-44fa-a662-f30391ef1ff8\") " pod="openshift-multus/multus-additional-cni-plugins-6qlsd" Oct 01 10:17:42 crc kubenswrapper[4735]: E1001 10:17:42.651020 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 10:17:44.650987355 +0000 UTC m=+23.343808667 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.651153 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-host-var-lib-cni-bin\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.651173 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-hostroot\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.651188 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/63cf5011-57e9-44fa-a662-f30391ef1ff8-system-cni-dir\") pod \"multus-additional-cni-plugins-6qlsd\" (UID: \"63cf5011-57e9-44fa-a662-f30391ef1ff8\") " pod="openshift-multus/multus-additional-cni-plugins-6qlsd" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.651206 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/63cf5011-57e9-44fa-a662-f30391ef1ff8-cni-binary-copy\") pod \"multus-additional-cni-plugins-6qlsd\" (UID: \"63cf5011-57e9-44fa-a662-f30391ef1ff8\") " pod="openshift-multus/multus-additional-cni-plugins-6qlsd" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 
10:17:42.651228 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.651269 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.651335 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-host-var-lib-kubelet\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: E1001 10:17:42.651355 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 10:17:42 crc kubenswrapper[4735]: E1001 10:17:42.651369 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 10:17:42 crc kubenswrapper[4735]: E1001 10:17:42.651379 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 10:17:42 crc kubenswrapper[4735]: E1001 10:17:42.651392 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 10:17:42 crc kubenswrapper[4735]: E1001 10:17:42.651439 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 10:17:44.651426646 +0000 UTC m=+23.344247898 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.651454 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-host-var-lib-cni-multus\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: E1001 10:17:42.651462 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 10:17:44.651454487 +0000 UTC m=+23.344275749 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.651480 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-multus-daemon-config\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.651510 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6hlq\" (UniqueName: \"kubernetes.io/projected/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-kube-api-access-v6hlq\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.651526 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-cnibin\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.651540 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-os-release\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 
10:17:42.651553 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-etc-kubernetes\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.651567 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/63cf5011-57e9-44fa-a662-f30391ef1ff8-cnibin\") pod \"multus-additional-cni-plugins-6qlsd\" (UID: \"63cf5011-57e9-44fa-a662-f30391ef1ff8\") " pod="openshift-multus/multus-additional-cni-plugins-6qlsd" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.651582 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/63cf5011-57e9-44fa-a662-f30391ef1ff8-os-release\") pod \"multus-additional-cni-plugins-6qlsd\" (UID: \"63cf5011-57e9-44fa-a662-f30391ef1ff8\") " pod="openshift-multus/multus-additional-cni-plugins-6qlsd" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.651602 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-host-run-netns\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.651637 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-host-run-k8s-cni-cncf-io\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: 
I1001 10:17:42.651657 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntcth\" (UniqueName: \"kubernetes.io/projected/63cf5011-57e9-44fa-a662-f30391ef1ff8-kube-api-access-ntcth\") pod \"multus-additional-cni-plugins-6qlsd\" (UID: \"63cf5011-57e9-44fa-a662-f30391ef1ff8\") " pod="openshift-multus/multus-additional-cni-plugins-6qlsd" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.651678 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-cni-binary-copy\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.651700 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-multus-socket-dir-parent\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.651726 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8c2fdbf0-2469-4ca0-8624-d63609123cd1-proxy-tls\") pod \"machine-config-daemon-xgg24\" (UID: \"8c2fdbf0-2469-4ca0-8624-d63609123cd1\") " pod="openshift-machine-config-operator/machine-config-daemon-xgg24" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.651742 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/63cf5011-57e9-44fa-a662-f30391ef1ff8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6qlsd\" (UID: \"63cf5011-57e9-44fa-a662-f30391ef1ff8\") " 
pod="openshift-multus/multus-additional-cni-plugins-6qlsd" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.653085 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.667665 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.679308 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.690644 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d6
08d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.698758 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.705842 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.705885 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.705897 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.705916 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.705927 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:42Z","lastTransitionTime":"2025-10-01T10:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.708608 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.718727 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.730744 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.748396 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.752267 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntcth\" (UniqueName: \"kubernetes.io/projected/63cf5011-57e9-44fa-a662-f30391ef1ff8-kube-api-access-ntcth\") pod \"multus-additional-cni-plugins-6qlsd\" (UID: \"63cf5011-57e9-44fa-a662-f30391ef1ff8\") " pod="openshift-multus/multus-additional-cni-plugins-6qlsd" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.752306 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/63cf5011-57e9-44fa-a662-f30391ef1ff8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6qlsd\" (UID: \"63cf5011-57e9-44fa-a662-f30391ef1ff8\") " pod="openshift-multus/multus-additional-cni-plugins-6qlsd" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.752331 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-cni-binary-copy\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 
10:17:42.752350 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-multus-socket-dir-parent\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.752374 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8c2fdbf0-2469-4ca0-8624-d63609123cd1-proxy-tls\") pod \"machine-config-daemon-xgg24\" (UID: \"8c2fdbf0-2469-4ca0-8624-d63609123cd1\") " pod="openshift-machine-config-operator/machine-config-daemon-xgg24" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.752395 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8c2fdbf0-2469-4ca0-8624-d63609123cd1-mcd-auth-proxy-config\") pod \"machine-config-daemon-xgg24\" (UID: \"8c2fdbf0-2469-4ca0-8624-d63609123cd1\") " pod="openshift-machine-config-operator/machine-config-daemon-xgg24" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.752416 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-system-cni-dir\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.752436 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-host-run-multus-certs\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.752458 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gr2m9\" (UniqueName: \"kubernetes.io/projected/8c2fdbf0-2469-4ca0-8624-d63609123cd1-kube-api-access-gr2m9\") pod \"machine-config-daemon-xgg24\" (UID: \"8c2fdbf0-2469-4ca0-8624-d63609123cd1\") " pod="openshift-machine-config-operator/machine-config-daemon-xgg24" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.752487 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-multus-cni-dir\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.752538 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-multus-conf-dir\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.752561 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8c2fdbf0-2469-4ca0-8624-d63609123cd1-rootfs\") pod \"machine-config-daemon-xgg24\" (UID: \"8c2fdbf0-2469-4ca0-8624-d63609123cd1\") " pod="openshift-machine-config-operator/machine-config-daemon-xgg24" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.752583 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/63cf5011-57e9-44fa-a662-f30391ef1ff8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6qlsd\" (UID: \"63cf5011-57e9-44fa-a662-f30391ef1ff8\") " pod="openshift-multus/multus-additional-cni-plugins-6qlsd" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.752605 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-host-var-lib-cni-bin\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.752625 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-hostroot\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.752642 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/63cf5011-57e9-44fa-a662-f30391ef1ff8-system-cni-dir\") pod \"multus-additional-cni-plugins-6qlsd\" (UID: \"63cf5011-57e9-44fa-a662-f30391ef1ff8\") " pod="openshift-multus/multus-additional-cni-plugins-6qlsd" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.752662 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/63cf5011-57e9-44fa-a662-f30391ef1ff8-cni-binary-copy\") pod \"multus-additional-cni-plugins-6qlsd\" (UID: \"63cf5011-57e9-44fa-a662-f30391ef1ff8\") " pod="openshift-multus/multus-additional-cni-plugins-6qlsd" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.752707 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-host-var-lib-kubelet\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.752728 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-cnibin\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.752748 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-host-var-lib-cni-multus\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.752774 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-multus-daemon-config\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.752795 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6hlq\" (UniqueName: \"kubernetes.io/projected/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-kube-api-access-v6hlq\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.752795 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-system-cni-dir\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.752813 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-os-release\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " 
pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.752836 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-etc-kubernetes\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.752854 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/63cf5011-57e9-44fa-a662-f30391ef1ff8-cnibin\") pod \"multus-additional-cni-plugins-6qlsd\" (UID: \"63cf5011-57e9-44fa-a662-f30391ef1ff8\") " pod="openshift-multus/multus-additional-cni-plugins-6qlsd" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.752867 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-hostroot\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.752879 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/63cf5011-57e9-44fa-a662-f30391ef1ff8-os-release\") pod \"multus-additional-cni-plugins-6qlsd\" (UID: \"63cf5011-57e9-44fa-a662-f30391ef1ff8\") " pod="openshift-multus/multus-additional-cni-plugins-6qlsd" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.752903 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-host-run-k8s-cni-cncf-io\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.752907 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-host-run-multus-certs\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.752924 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-host-run-netns\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.752983 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-host-run-netns\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.753020 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/63cf5011-57e9-44fa-a662-f30391ef1ff8-system-cni-dir\") pod \"multus-additional-cni-plugins-6qlsd\" (UID: \"63cf5011-57e9-44fa-a662-f30391ef1ff8\") " pod="openshift-multus/multus-additional-cni-plugins-6qlsd" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.753224 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-multus-cni-dir\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.753278 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-multus-conf-dir\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.753317 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8c2fdbf0-2469-4ca0-8624-d63609123cd1-rootfs\") pod \"machine-config-daemon-xgg24\" (UID: \"8c2fdbf0-2469-4ca0-8624-d63609123cd1\") " pod="openshift-machine-config-operator/machine-config-daemon-xgg24" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.753693 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/63cf5011-57e9-44fa-a662-f30391ef1ff8-cni-binary-copy\") pod \"multus-additional-cni-plugins-6qlsd\" (UID: \"63cf5011-57e9-44fa-a662-f30391ef1ff8\") " pod="openshift-multus/multus-additional-cni-plugins-6qlsd" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.753745 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-host-var-lib-kubelet\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.753785 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-cnibin\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.753813 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-host-var-lib-cni-multus\") pod \"multus-8dz9b\" (UID: 
\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.754037 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/63cf5011-57e9-44fa-a662-f30391ef1ff8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6qlsd\" (UID: \"63cf5011-57e9-44fa-a662-f30391ef1ff8\") " pod="openshift-multus/multus-additional-cni-plugins-6qlsd" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.754120 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-host-var-lib-cni-bin\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.754197 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-etc-kubernetes\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.754304 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-multus-daemon-config\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.754347 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/63cf5011-57e9-44fa-a662-f30391ef1ff8-cnibin\") pod \"multus-additional-cni-plugins-6qlsd\" (UID: \"63cf5011-57e9-44fa-a662-f30391ef1ff8\") " pod="openshift-multus/multus-additional-cni-plugins-6qlsd" Oct 01 10:17:42 crc 
kubenswrapper[4735]: I1001 10:17:42.754396 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/63cf5011-57e9-44fa-a662-f30391ef1ff8-os-release\") pod \"multus-additional-cni-plugins-6qlsd\" (UID: \"63cf5011-57e9-44fa-a662-f30391ef1ff8\") " pod="openshift-multus/multus-additional-cni-plugins-6qlsd" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.754424 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-host-run-k8s-cni-cncf-io\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.754468 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-os-release\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.754652 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-cni-binary-copy\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.754738 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-multus-socket-dir-parent\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.755167 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" 
(UniqueName: \"kubernetes.io/host-path/63cf5011-57e9-44fa-a662-f30391ef1ff8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6qlsd\" (UID: \"63cf5011-57e9-44fa-a662-f30391ef1ff8\") " pod="openshift-multus/multus-additional-cni-plugins-6qlsd" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.755266 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8c2fdbf0-2469-4ca0-8624-d63609123cd1-mcd-auth-proxy-config\") pod \"machine-config-daemon-xgg24\" (UID: \"8c2fdbf0-2469-4ca0-8624-d63609123cd1\") " pod="openshift-machine-config-operator/machine-config-daemon-xgg24" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.767034 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.782866 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6hlq\" (UniqueName: \"kubernetes.io/projected/5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7-kube-api-access-v6hlq\") pod \"multus-8dz9b\" (UID: \"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\") " pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.784994 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.785529 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntcth\" (UniqueName: \"kubernetes.io/projected/63cf5011-57e9-44fa-a662-f30391ef1ff8-kube-api-access-ntcth\") pod \"multus-additional-cni-plugins-6qlsd\" (UID: 
\"63cf5011-57e9-44fa-a662-f30391ef1ff8\") " pod="openshift-multus/multus-additional-cni-plugins-6qlsd" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.795517 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.808099 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.808149 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.808158 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.808171 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.808180 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:42Z","lastTransitionTime":"2025-10-01T10:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.812782 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.823675 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.836728 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.852721 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-8dz9b" Oct 01 10:17:42 crc kubenswrapper[4735]: W1001 10:17:42.863177 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d8707c5_6fad_4ba7_b2ea_a0916dd86bf7.slice/crio-a10178b8dde23b5df1d9c14da854a4f105a72a1bfc653780fdd9cfbd85d86442 WatchSource:0}: Error finding container a10178b8dde23b5df1d9c14da854a4f105a72a1bfc653780fdd9cfbd85d86442: Status 404 returned error can't find the container with id a10178b8dde23b5df1d9c14da854a4f105a72a1bfc653780fdd9cfbd85d86442 Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.870313 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.896252 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:17:42 crc kubenswrapper[4735]: E1001 10:17:42.896375 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.897366 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:17:42 crc kubenswrapper[4735]: E1001 10:17:42.897467 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.897648 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:17:42 crc kubenswrapper[4735]: E1001 10:17:42.897735 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.911266 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.911341 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.911353 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.911379 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.911408 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:42Z","lastTransitionTime":"2025-10-01T10:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.918537 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-k5mgz"] Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.919296 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.923369 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.923556 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.923554 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.923682 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.924117 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.939594 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.939630 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 01 10:17:42 crc kubenswrapper[4735]: I1001 10:17:42.990586 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.007344 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:43Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.013863 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.013892 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.013902 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.013915 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.013926 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:43Z","lastTransitionTime":"2025-10-01T10:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.023269 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8dz9b" event={"ID":"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7","Type":"ContainerStarted","Data":"cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23"} Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.023309 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8dz9b" event={"ID":"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7","Type":"ContainerStarted","Data":"a10178b8dde23b5df1d9c14da854a4f105a72a1bfc653780fdd9cfbd85d86442"} Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.024608 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" event={"ID":"63cf5011-57e9-44fa-a662-f30391ef1ff8","Type":"ContainerStarted","Data":"9c7954c2527a0c2b656f9e90319374739e6a8fcb3dab1af833e2734a9cb9e265"} Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.034794 4735 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:43Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.048927 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:43Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.054679 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/32f1531f-5034-48d4-b694-efc774226e37-env-overrides\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.054794 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-run-openvswitch\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.054864 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-run-netns\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.054890 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-node-log\") 
pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.054926 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-run-ovn\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.054951 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-cni-netd\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.054983 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-cni-bin\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.055031 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-kubelet\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.055048 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/32f1531f-5034-48d4-b694-efc774226e37-ovnkube-config\") pod 
\"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.055062 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/32f1531f-5034-48d4-b694-efc774226e37-ovn-node-metrics-cert\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.055094 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.055110 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-run-systemd\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.055126 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-log-socket\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.055142 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-systemd-units\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.055155 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-etc-openvswitch\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.055172 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-var-lib-openvswitch\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.055186 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqpq8\" (UniqueName: \"kubernetes.io/projected/32f1531f-5034-48d4-b694-efc774226e37-kube-api-access-fqpq8\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.055208 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-run-ovn-kubernetes\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.055223 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-slash\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.055242 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/32f1531f-5034-48d4-b694-efc774226e37-ovnkube-script-lib\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.062074 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:43Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.076678 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:43Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.086673 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac3
9aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:43Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.105256 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:43Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.116638 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.116676 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.116684 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.116697 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.116707 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:43Z","lastTransitionTime":"2025-10-01T10:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.118002 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:43Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.134238 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:43Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.146991 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:43Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.156241 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.156329 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-run-systemd\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.156345 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-log-socket\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.156365 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-systemd-units\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.156367 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.156422 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-systemd-units\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 
10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.156431 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-run-systemd\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.156465 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-etc-openvswitch\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.156460 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-log-socket\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.156379 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-etc-openvswitch\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.156598 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-var-lib-openvswitch\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.156622 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fqpq8\" (UniqueName: \"kubernetes.io/projected/32f1531f-5034-48d4-b694-efc774226e37-kube-api-access-fqpq8\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.156663 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-run-ovn-kubernetes\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.156686 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/32f1531f-5034-48d4-b694-efc774226e37-ovnkube-script-lib\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.156707 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-slash\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.156726 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/32f1531f-5034-48d4-b694-efc774226e37-env-overrides\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.156731 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-run-ovn-kubernetes\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.156751 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-run-openvswitch\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.156778 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-run-openvswitch\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.156793 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-run-netns\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.156802 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-slash\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.156826 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-node-log\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.157812 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/32f1531f-5034-48d4-b694-efc774226e37-ovnkube-script-lib\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.159954 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:43Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.160009 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-var-lib-openvswitch\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.160420 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/32f1531f-5034-48d4-b694-efc774226e37-env-overrides\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.160504 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-run-ovn\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.160533 4735 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-cni-netd\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.160556 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-cni-bin\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.160574 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-kubelet\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.160597 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/32f1531f-5034-48d4-b694-efc774226e37-ovn-node-metrics-cert\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.160619 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/32f1531f-5034-48d4-b694-efc774226e37-ovnkube-config\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.161087 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/32f1531f-5034-48d4-b694-efc774226e37-ovnkube-config\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.161137 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-run-ovn\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.161172 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-cni-netd\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.161208 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-cni-bin\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.161243 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-kubelet\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.162345 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-run-netns\") pod \"ovnkube-node-k5mgz\" (UID: 
\"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.162362 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-node-log\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.164562 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/32f1531f-5034-48d4-b694-efc774226e37-ovn-node-metrics-cert\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.175524 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqpq8\" (UniqueName: \"kubernetes.io/projected/32f1531f-5034-48d4-b694-efc774226e37-kube-api-access-fqpq8\") pod \"ovnkube-node-k5mgz\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.218577 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.218620 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.218633 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.218649 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.218661 4735 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:43Z","lastTransitionTime":"2025-10-01T10:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.251512 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:43 crc kubenswrapper[4735]: W1001 10:17:43.266462 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32f1531f_5034_48d4_b694_efc774226e37.slice/crio-e73b6c84ab970e729df6b604555f77f77039617e883743768caa22d97e5dd9fc WatchSource:0}: Error finding container e73b6c84ab970e729df6b604555f77f77039617e883743768caa22d97e5dd9fc: Status 404 returned error can't find the container with id e73b6c84ab970e729df6b604555f77f77039617e883743768caa22d97e5dd9fc Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.321117 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.321155 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.321166 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.321183 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.321193 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:43Z","lastTransitionTime":"2025-10-01T10:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.422886 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.422933 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.422947 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.422963 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.422973 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:43Z","lastTransitionTime":"2025-10-01T10:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.525643 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.525676 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.525685 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.525697 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.525706 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:43Z","lastTransitionTime":"2025-10-01T10:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.628337 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.628367 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.628378 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.628395 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.628409 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:43Z","lastTransitionTime":"2025-10-01T10:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.721046 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.730244 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.730529 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.730639 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.730745 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.730840 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:43Z","lastTransitionTime":"2025-10-01T10:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.732986 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr2m9\" (UniqueName: \"kubernetes.io/projected/8c2fdbf0-2469-4ca0-8624-d63609123cd1-kube-api-access-gr2m9\") pod \"machine-config-daemon-xgg24\" (UID: \"8c2fdbf0-2469-4ca0-8624-d63609123cd1\") " pod="openshift-machine-config-operator/machine-config-daemon-xgg24" Oct 01 10:17:43 crc kubenswrapper[4735]: E1001 10:17:43.755275 4735 secret.go:188] Couldn't get secret openshift-machine-config-operator/proxy-tls: failed to sync secret cache: timed out waiting for the condition Oct 01 10:17:43 crc kubenswrapper[4735]: E1001 10:17:43.755375 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c2fdbf0-2469-4ca0-8624-d63609123cd1-proxy-tls podName:8c2fdbf0-2469-4ca0-8624-d63609123cd1 nodeName:}" failed. No retries permitted until 2025-10-01 10:17:44.255347231 +0000 UTC m=+22.948168493 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/8c2fdbf0-2469-4ca0-8624-d63609123cd1-proxy-tls") pod "machine-config-daemon-xgg24" (UID: "8c2fdbf0-2469-4ca0-8624-d63609123cd1") : failed to sync secret cache: timed out waiting for the condition Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.832667 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.832707 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.832716 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.832732 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.832741 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:43Z","lastTransitionTime":"2025-10-01T10:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.934731 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.934767 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.934775 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.934788 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:43 crc kubenswrapper[4735]: I1001 10:17:43.934797 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:43Z","lastTransitionTime":"2025-10-01T10:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.029338 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1"} Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.030821 4735 generic.go:334] "Generic (PLEG): container finished" podID="32f1531f-5034-48d4-b694-efc774226e37" containerID="b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1" exitCode=0 Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.030903 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" event={"ID":"32f1531f-5034-48d4-b694-efc774226e37","Type":"ContainerDied","Data":"b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1"} Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.030934 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" event={"ID":"32f1531f-5034-48d4-b694-efc774226e37","Type":"ContainerStarted","Data":"e73b6c84ab970e729df6b604555f77f77039617e883743768caa22d97e5dd9fc"} Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.033107 4735 generic.go:334] "Generic (PLEG): container finished" podID="63cf5011-57e9-44fa-a662-f30391ef1ff8" containerID="440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8" exitCode=0 Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.033627 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" event={"ID":"63cf5011-57e9-44fa-a662-f30391ef1ff8","Type":"ContainerDied","Data":"440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8"} Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.036317 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.036352 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.036362 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.036375 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.036385 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:44Z","lastTransitionTime":"2025-10-01T10:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.049428 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-zdmp4"] Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.049783 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-zdmp4" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.051688 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.051934 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.051977 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.053006 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.053675 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.073000 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.089782 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.108384 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.119653 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.125251 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.135819 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.138844 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.138877 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.138885 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:44 crc 
kubenswrapper[4735]: I1001 10:17:44.138899 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.138908 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:44Z","lastTransitionTime":"2025-10-01T10:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.147544 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.159754 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.169733 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/14eb0416-c94e-4b5c-824e-720abd2fe3f2-host\") pod \"node-ca-zdmp4\" (UID: \"14eb0416-c94e-4b5c-824e-720abd2fe3f2\") " pod="openshift-image-registry/node-ca-zdmp4" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.169794 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/14eb0416-c94e-4b5c-824e-720abd2fe3f2-serviceca\") pod \"node-ca-zdmp4\" (UID: \"14eb0416-c94e-4b5c-824e-720abd2fe3f2\") " pod="openshift-image-registry/node-ca-zdmp4" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.169820 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prqcf\" (UniqueName: \"kubernetes.io/projected/14eb0416-c94e-4b5c-824e-720abd2fe3f2-kube-api-access-prqcf\") pod \"node-ca-zdmp4\" (UID: \"14eb0416-c94e-4b5c-824e-720abd2fe3f2\") " pod="openshift-image-registry/node-ca-zdmp4" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.171912 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.187750 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.198638 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.215942 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.235127 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.241580 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.241623 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.241634 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.241652 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.241665 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:44Z","lastTransitionTime":"2025-10-01T10:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.249153 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.260907 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.272667 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.273934 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prqcf\" (UniqueName: \"kubernetes.io/projected/14eb0416-c94e-4b5c-824e-720abd2fe3f2-kube-api-access-prqcf\") pod \"node-ca-zdmp4\" (UID: \"14eb0416-c94e-4b5c-824e-720abd2fe3f2\") " pod="openshift-image-registry/node-ca-zdmp4" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.274004 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/14eb0416-c94e-4b5c-824e-720abd2fe3f2-host\") pod \"node-ca-zdmp4\" (UID: \"14eb0416-c94e-4b5c-824e-720abd2fe3f2\") " pod="openshift-image-registry/node-ca-zdmp4" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.274023 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8c2fdbf0-2469-4ca0-8624-d63609123cd1-proxy-tls\") pod \"machine-config-daemon-xgg24\" (UID: \"8c2fdbf0-2469-4ca0-8624-d63609123cd1\") " pod="openshift-machine-config-operator/machine-config-daemon-xgg24" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.274040 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/14eb0416-c94e-4b5c-824e-720abd2fe3f2-serviceca\") pod \"node-ca-zdmp4\" (UID: \"14eb0416-c94e-4b5c-824e-720abd2fe3f2\") " pod="openshift-image-registry/node-ca-zdmp4" Oct 01 10:17:44 crc 
kubenswrapper[4735]: I1001 10:17:44.274169 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/14eb0416-c94e-4b5c-824e-720abd2fe3f2-host\") pod \"node-ca-zdmp4\" (UID: \"14eb0416-c94e-4b5c-824e-720abd2fe3f2\") " pod="openshift-image-registry/node-ca-zdmp4" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.274823 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/14eb0416-c94e-4b5c-824e-720abd2fe3f2-serviceca\") pod \"node-ca-zdmp4\" (UID: \"14eb0416-c94e-4b5c-824e-720abd2fe3f2\") " pod="openshift-image-registry/node-ca-zdmp4" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.280298 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8c2fdbf0-2469-4ca0-8624-d63609123cd1-proxy-tls\") pod \"machine-config-daemon-xgg24\" (UID: \"8c2fdbf0-2469-4ca0-8624-d63609123cd1\") " pod="openshift-machine-config-operator/machine-config-daemon-xgg24" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.292633 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prqcf\" (UniqueName: \"kubernetes.io/projected/14eb0416-c94e-4b5c-824e-720abd2fe3f2-kube-api-access-prqcf\") pod \"node-ca-zdmp4\" (UID: \"14eb0416-c94e-4b5c-824e-720abd2fe3f2\") " pod="openshift-image-registry/node-ca-zdmp4" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.294319 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.309088 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.321974 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.332808 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.343987 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.344035 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.344047 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 
10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.344064 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.344075 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:44Z","lastTransitionTime":"2025-10-01T10:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.346153 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.359572 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.361733 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.364541 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-zdmp4" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.372684 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: W1001 10:17:44.377344 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c2fdbf0_2469_4ca0_8624_d63609123cd1.slice/crio-ef5834f0c686dc03bf8771dabef68f3ace8291d90e2e90bc1104d61a264c1ce3 WatchSource:0}: Error finding container ef5834f0c686dc03bf8771dabef68f3ace8291d90e2e90bc1104d61a264c1ce3: Status 404 returned error can't find the container with id ef5834f0c686dc03bf8771dabef68f3ace8291d90e2e90bc1104d61a264c1ce3 Oct 01 10:17:44 crc kubenswrapper[4735]: W1001 10:17:44.381691 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14eb0416_c94e_4b5c_824e_720abd2fe3f2.slice/crio-dacdea2fda1d8e06d955991aba729245a3b3a57ec9500f29ee06192ca9286433 WatchSource:0}: Error finding container dacdea2fda1d8e06d955991aba729245a3b3a57ec9500f29ee06192ca9286433: Status 404 returned error can't find the container with id dacdea2fda1d8e06d955991aba729245a3b3a57ec9500f29ee06192ca9286433 Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.394311 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.408604 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.446568 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.446603 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.446612 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.446624 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.446633 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:44Z","lastTransitionTime":"2025-10-01T10:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.542385 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.545936 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.548582 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.548620 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.548631 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.548647 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:44 crc kubenswrapper[4735]: 
I1001 10:17:44.548658 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:44Z","lastTransitionTime":"2025-10-01T10:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.550891 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.552524 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.568851 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.588536 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.604139 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.617923 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.630925 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.642986 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.651063 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.651096 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.651104 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 
10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.651118 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.651156 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:44Z","lastTransitionTime":"2025-10-01T10:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.659416 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.671905 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.677829 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.677927 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:17:44 crc kubenswrapper[4735]: E1001 10:17:44.677986 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:17:48.677960001 +0000 UTC m=+27.370781263 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:17:44 crc kubenswrapper[4735]: E1001 10:17:44.678017 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 10:17:44 crc kubenswrapper[4735]: E1001 10:17:44.678032 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 10:17:44 crc kubenswrapper[4735]: E1001 10:17:44.678042 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 10:17:44 crc 
kubenswrapper[4735]: I1001 10:17:44.678046 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:17:44 crc kubenswrapper[4735]: E1001 10:17:44.678081 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 10:17:48.678068013 +0000 UTC m=+27.370889275 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.678100 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.678121 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:17:44 crc kubenswrapper[4735]: E1001 10:17:44.678134 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 10:17:44 crc kubenswrapper[4735]: E1001 10:17:44.678167 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 10:17:48.678161036 +0000 UTC m=+27.370982298 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 10:17:44 crc kubenswrapper[4735]: E1001 10:17:44.678176 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 10:17:44 crc kubenswrapper[4735]: E1001 10:17:44.678207 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 10:17:48.678200017 +0000 UTC m=+27.371021279 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 10:17:44 crc kubenswrapper[4735]: E1001 10:17:44.678251 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 10:17:44 crc kubenswrapper[4735]: E1001 10:17:44.678272 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 10:17:44 crc kubenswrapper[4735]: E1001 10:17:44.678284 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 10:17:44 crc kubenswrapper[4735]: E1001 10:17:44.678329 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 10:17:48.678316141 +0000 UTC m=+27.371137503 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.685690 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.700076 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.714481 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.728638 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.740828 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.753709 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.753758 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.753769 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.753784 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.753796 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:44Z","lastTransitionTime":"2025-10-01T10:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.754586 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.765507 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.776889 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\
",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.793731 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.833191 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.854970 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73b8adee3754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.857263 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.857309 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.857320 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.857337 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.857351 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:44Z","lastTransitionTime":"2025-10-01T10:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.872687 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.892968 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.895950 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:17:44 crc kubenswrapper[4735]: E1001 10:17:44.896041 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.896070 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.896116 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:17:44 crc kubenswrapper[4735]: E1001 10:17:44.896262 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:17:44 crc kubenswrapper[4735]: E1001 10:17:44.896324 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.905803 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.919518 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.931197 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.941659 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.951406 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:44Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.959763 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.959802 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.959813 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 
10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.959835 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:44 crc kubenswrapper[4735]: I1001 10:17:44.959847 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:44Z","lastTransitionTime":"2025-10-01T10:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.038257 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" event={"ID":"63cf5011-57e9-44fa-a662-f30391ef1ff8","Type":"ContainerStarted","Data":"de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8"} Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.039791 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" event={"ID":"8c2fdbf0-2469-4ca0-8624-d63609123cd1","Type":"ContainerStarted","Data":"b1b444968cd2dd7c27e3ab5de010228bdb6aca03388c8bf326ff5dc5b65d59a6"} Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.039816 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" event={"ID":"8c2fdbf0-2469-4ca0-8624-d63609123cd1","Type":"ContainerStarted","Data":"1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a32554baabf84253c7bb4d630c"} Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.039827 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" 
event={"ID":"8c2fdbf0-2469-4ca0-8624-d63609123cd1","Type":"ContainerStarted","Data":"ef5834f0c686dc03bf8771dabef68f3ace8291d90e2e90bc1104d61a264c1ce3"} Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.041034 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zdmp4" event={"ID":"14eb0416-c94e-4b5c-824e-720abd2fe3f2","Type":"ContainerStarted","Data":"b331af5f1d4f4b11c27d1c2285e760a27f9af30ff7fba24c2b73b63385f1e9fd"} Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.041059 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zdmp4" event={"ID":"14eb0416-c94e-4b5c-824e-720abd2fe3f2","Type":"ContainerStarted","Data":"dacdea2fda1d8e06d955991aba729245a3b3a57ec9500f29ee06192ca9286433"} Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.045152 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" event={"ID":"32f1531f-5034-48d4-b694-efc774226e37","Type":"ContainerStarted","Data":"efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b"} Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.045201 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" event={"ID":"32f1531f-5034-48d4-b694-efc774226e37","Type":"ContainerStarted","Data":"a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1"} Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.045214 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" event={"ID":"32f1531f-5034-48d4-b694-efc774226e37","Type":"ContainerStarted","Data":"cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55"} Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.045231 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" 
event={"ID":"32f1531f-5034-48d4-b694-efc774226e37","Type":"ContainerStarted","Data":"033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5"} Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.045243 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" event={"ID":"32f1531f-5034-48d4-b694-efc774226e37","Type":"ContainerStarted","Data":"7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766"} Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.045256 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" event={"ID":"32f1531f-5034-48d4-b694-efc774226e37","Type":"ContainerStarted","Data":"54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82"} Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.053719 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73b8adee3754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:45Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.062047 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.062083 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.062092 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.062105 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.062114 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:45Z","lastTransitionTime":"2025-10-01T10:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.064771 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:45Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.085588 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:45Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.100572 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:45Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.112037 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:45Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.122081 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:17:45Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.132389 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:17:45Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.141020 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:45Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.151988 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:45Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.163248 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:45Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.164538 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.164564 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.164572 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.164585 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.164595 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:45Z","lastTransitionTime":"2025-10-01T10:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.175156 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:45Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.189252 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\
",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:45Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.199223 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:45Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.211304 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:45Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.224336 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:17:45Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.237175 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:17:45Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.257892 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b331af5f1d4f4b11c27d1c2285e760a27f9af30ff7fba24c2b73b63385f1e9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:45Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.267030 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.267077 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.267088 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.267107 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.267119 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:45Z","lastTransitionTime":"2025-10-01T10:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.299994 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:45Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.339093 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:45Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.368938 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.368973 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.368981 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.368995 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.369006 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:45Z","lastTransitionTime":"2025-10-01T10:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.377596 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b444968cd2dd7c27e3ab5de010228bdb6aca03388c8bf326ff5dc5b65d59a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a32554baabf84253c7bb4d630c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-01T10:17:45Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.421571 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:45Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.460773 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:45Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.471529 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.471572 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.471583 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:45 crc 
kubenswrapper[4735]: I1001 10:17:45.471601 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.471612 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:45Z","lastTransitionTime":"2025-10-01T10:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.502893 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:45Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.538528 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:45Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.573865 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.573912 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.573927 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.573944 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.573955 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:45Z","lastTransitionTime":"2025-10-01T10:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.582612 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:45Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.622378 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73b8adee3754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:45Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.658178 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:45Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.676841 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.676885 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.676899 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.676918 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.676931 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:45Z","lastTransitionTime":"2025-10-01T10:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.705435 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:45Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.778967 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.779013 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.779024 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.779042 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.779054 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:45Z","lastTransitionTime":"2025-10-01T10:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.880905 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.880955 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.880965 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.880981 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.880992 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:45Z","lastTransitionTime":"2025-10-01T10:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.984060 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.984103 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.984113 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.984127 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:45 crc kubenswrapper[4735]: I1001 10:17:45.984139 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:45Z","lastTransitionTime":"2025-10-01T10:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.049786 4735 generic.go:334] "Generic (PLEG): container finished" podID="63cf5011-57e9-44fa-a662-f30391ef1ff8" containerID="de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8" exitCode=0 Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.049837 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" event={"ID":"63cf5011-57e9-44fa-a662-f30391ef1ff8","Type":"ContainerDied","Data":"de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8"} Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.065984 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:46Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.085732 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:46Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.086242 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.086280 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.086292 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:46 crc 
kubenswrapper[4735]: I1001 10:17:46.086312 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.086326 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:46Z","lastTransitionTime":"2025-10-01T10:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.097644 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b444968cd2dd7c27e3ab5de010228bdb6aca03388c8bf326ff5dc5b65d59a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a32554baabf84253c7bb4d630c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:46Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.109654 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:46Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.120985 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:46Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.135046 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:46Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.148689 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73b8adee3754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:46Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.159346 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:46Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.177533 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:46Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.189370 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:46Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.190813 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.190875 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.190886 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 
10:17:46.190925 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.190937 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:46Z","lastTransitionTime":"2025-10-01T10:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.200572 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:46Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.210886 4735 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:46Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.224488 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:46Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.259806 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b331af5f1d4f4b11c27d1c2285e760a27f9af30ff7fba24c2b73b63385f1e9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17
:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:46Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.294685 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.294717 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.294726 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.294741 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.294754 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:46Z","lastTransitionTime":"2025-10-01T10:17:46Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.396750 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.397049 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.397059 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.397074 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.397084 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:46Z","lastTransitionTime":"2025-10-01T10:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.499841 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.499893 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.500092 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.500109 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.500122 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:46Z","lastTransitionTime":"2025-10-01T10:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.602623 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.602655 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.602672 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.602687 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.602700 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:46Z","lastTransitionTime":"2025-10-01T10:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.704978 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.705010 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.705018 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.705032 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.705041 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:46Z","lastTransitionTime":"2025-10-01T10:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.807119 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.807182 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.807205 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.807233 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.807254 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:46Z","lastTransitionTime":"2025-10-01T10:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.896882 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.896927 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.896967 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:17:46 crc kubenswrapper[4735]: E1001 10:17:46.897027 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:17:46 crc kubenswrapper[4735]: E1001 10:17:46.897113 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:17:46 crc kubenswrapper[4735]: E1001 10:17:46.897224 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.909441 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.909477 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.909486 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.909513 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:46 crc kubenswrapper[4735]: I1001 10:17:46.909523 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:46Z","lastTransitionTime":"2025-10-01T10:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.012715 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.012776 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.012786 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.012806 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.012850 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:47Z","lastTransitionTime":"2025-10-01T10:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.055170 4735 generic.go:334] "Generic (PLEG): container finished" podID="63cf5011-57e9-44fa-a662-f30391ef1ff8" containerID="33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1" exitCode=0 Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.055240 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" event={"ID":"63cf5011-57e9-44fa-a662-f30391ef1ff8","Type":"ContainerDied","Data":"33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1"} Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.067617 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b444968cd2dd7c27e3ab5de010228bdb6aca03388c8bf326ff5dc5b65d59a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef
318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a32554baabf84253c7bb4d630c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:47Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.081799 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:47Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.094458 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:47Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.112227 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:47Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 
10:17:47.115053 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.115082 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.115091 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.115105 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.115114 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:47Z","lastTransitionTime":"2025-10-01T10:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.124897 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:47Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.136556 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:47Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.149410 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73b8adee3754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:47Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.161410 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:47Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.180986 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:47Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.195312 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:47Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.209464 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:47Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.217730 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:47 crc 
kubenswrapper[4735]: I1001 10:17:47.217794 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.217808 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.217824 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.217836 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:47Z","lastTransitionTime":"2025-10-01T10:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.221621 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b331af5f1d4f4b11c27d1c2285e760a27f9af30ff7fba24c2b73b63385f1e9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:47Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.233244 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:47Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.249340 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:47Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.321068 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.321113 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.321123 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.321139 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.321149 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:47Z","lastTransitionTime":"2025-10-01T10:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.423982 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.424022 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.424035 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.424048 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.424057 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:47Z","lastTransitionTime":"2025-10-01T10:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.526551 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.526589 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.526601 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.526617 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.526630 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:47Z","lastTransitionTime":"2025-10-01T10:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.629520 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.629600 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.629613 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.629632 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.629644 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:47Z","lastTransitionTime":"2025-10-01T10:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.731423 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.731457 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.731468 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.731484 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.731518 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:47Z","lastTransitionTime":"2025-10-01T10:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.834555 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.834583 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.834594 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.834606 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.834614 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:47Z","lastTransitionTime":"2025-10-01T10:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.937255 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.937283 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.937291 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.937305 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:47 crc kubenswrapper[4735]: I1001 10:17:47.937314 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:47Z","lastTransitionTime":"2025-10-01T10:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.039552 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.039592 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.039601 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.039615 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.039627 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:48Z","lastTransitionTime":"2025-10-01T10:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.060146 4735 generic.go:334] "Generic (PLEG): container finished" podID="63cf5011-57e9-44fa-a662-f30391ef1ff8" containerID="2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c" exitCode=0 Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.060209 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" event={"ID":"63cf5011-57e9-44fa-a662-f30391ef1ff8","Type":"ContainerDied","Data":"2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c"} Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.063903 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" event={"ID":"32f1531f-5034-48d4-b694-efc774226e37","Type":"ContainerStarted","Data":"45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724"} Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.074313 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:48Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.087970 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:48Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.101682 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:48Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.112751 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73b8adee3754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:48Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.122394 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:48Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.141717 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.141780 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.141796 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.141821 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.141840 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:48Z","lastTransitionTime":"2025-10-01T10:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.143085 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:48Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.156551 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:48Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.171918 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:48Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.187250 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:17:48Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.200695 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:17:48Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.209912 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b331af5f1d4f4b11c27d1c2285e760a27f9af30ff7fba24c2b73b63385f1e9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:48Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.221982 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:48Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.232566 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:48Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.244385 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b444968cd2dd7c27e3ab5de010228bdb6aca03388c8bf326ff5dc5b65d59a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a3
2554baabf84253c7bb4d630c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:48Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.245170 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.245204 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.245214 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:48 crc 
kubenswrapper[4735]: I1001 10:17:48.245259 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.245272 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:48Z","lastTransitionTime":"2025-10-01T10:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.347430 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.347469 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.347480 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.347512 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.347525 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:48Z","lastTransitionTime":"2025-10-01T10:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.450308 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.450376 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.450395 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.450418 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.450436 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:48Z","lastTransitionTime":"2025-10-01T10:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.552186 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.552228 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.552236 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.552252 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.552263 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:48Z","lastTransitionTime":"2025-10-01T10:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.654345 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.654378 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.654389 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.654404 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.654414 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:48Z","lastTransitionTime":"2025-10-01T10:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.719263 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.719355 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:17:48 crc kubenswrapper[4735]: E1001 10:17:48.719366 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:17:56.719346938 +0000 UTC m=+35.412168200 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.719395 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.719444 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:17:48 crc kubenswrapper[4735]: E1001 10:17:48.719472 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 10:17:48 crc kubenswrapper[4735]: E1001 10:17:48.719486 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 10:17:48 crc kubenswrapper[4735]: E1001 10:17:48.719528 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 10:17:48 crc kubenswrapper[4735]: E1001 10:17:48.719534 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.719472 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:17:48 crc kubenswrapper[4735]: E1001 10:17:48.719568 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 10:17:56.719557273 +0000 UTC m=+35.412378535 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 10:17:48 crc kubenswrapper[4735]: E1001 10:17:48.719586 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-01 10:17:56.719579144 +0000 UTC m=+35.412400406 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 10:17:48 crc kubenswrapper[4735]: E1001 10:17:48.719608 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 10:17:48 crc kubenswrapper[4735]: E1001 10:17:48.719625 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 10:17:48 crc kubenswrapper[4735]: E1001 10:17:48.719628 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 10:17:48 crc kubenswrapper[4735]: E1001 10:17:48.719723 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 10:17:56.719704057 +0000 UTC m=+35.412525319 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 10:17:48 crc kubenswrapper[4735]: E1001 10:17:48.719638 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 10:17:48 crc kubenswrapper[4735]: E1001 10:17:48.719793 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 10:17:56.719782559 +0000 UTC m=+35.412603881 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.756988 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.757044 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.757056 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.757075 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.757087 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:48Z","lastTransitionTime":"2025-10-01T10:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.859228 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.859277 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.859289 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.859305 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.859320 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:48Z","lastTransitionTime":"2025-10-01T10:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.896780 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.896797 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:17:48 crc kubenswrapper[4735]: E1001 10:17:48.897005 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.896816 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:17:48 crc kubenswrapper[4735]: E1001 10:17:48.897266 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:17:48 crc kubenswrapper[4735]: E1001 10:17:48.897135 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.961222 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.961258 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.961267 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.961282 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:48 crc kubenswrapper[4735]: I1001 10:17:48.961292 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:48Z","lastTransitionTime":"2025-10-01T10:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.064069 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.064660 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.064697 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.064717 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.064732 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:49Z","lastTransitionTime":"2025-10-01T10:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.067975 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" event={"ID":"63cf5011-57e9-44fa-a662-f30391ef1ff8","Type":"ContainerStarted","Data":"364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481"} Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.078074 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:49Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.094446 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-me
trics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/sec
rets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"
etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-s
etup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:49Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.108161 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73b8adee3754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:49Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.117384 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b331af5f1d4f4b11c27d1c2285e760a27f9af30ff7fba24c2b73b63385f1e9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:49Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.130353 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:49Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.141075 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:49Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.153116 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:17:49Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.167345 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.167381 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.167390 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.167404 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.167414 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:49Z","lastTransitionTime":"2025-10-01T10:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.168361 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:49Z 
is after 2025-08-24T17:21:41Z" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.181235 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:49Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.194477 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:49Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.208902 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b444968cd2dd7c27e3ab5de010228bdb6aca03388c8bf326ff5dc5b65d59a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a3
2554baabf84253c7bb4d630c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:49Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.220191 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:49Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.232545 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:49Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.247531 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc
84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:49Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.270348 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.270595 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.270711 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.270951 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.271197 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:49Z","lastTransitionTime":"2025-10-01T10:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.373481 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.373544 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.373556 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.373572 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.373581 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:49Z","lastTransitionTime":"2025-10-01T10:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.475661 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.475703 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.475715 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.475732 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.475743 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:49Z","lastTransitionTime":"2025-10-01T10:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.577834 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.577877 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.577888 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.577901 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.577910 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:49Z","lastTransitionTime":"2025-10-01T10:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.679969 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.679999 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.680010 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.680026 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.680038 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:49Z","lastTransitionTime":"2025-10-01T10:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.781936 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.781979 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.781992 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.782011 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.782024 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:49Z","lastTransitionTime":"2025-10-01T10:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.885115 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.885164 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.885178 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.885194 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.885224 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:49Z","lastTransitionTime":"2025-10-01T10:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.987877 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.987905 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.987915 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.987929 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:49 crc kubenswrapper[4735]: I1001 10:17:49.987940 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:49Z","lastTransitionTime":"2025-10-01T10:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.076357 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" event={"ID":"32f1531f-5034-48d4-b694-efc774226e37","Type":"ContainerStarted","Data":"31d15a6854c302bea8840dd226b3d3bcef747aad242a1f817ac0651e90feac4b"} Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.076688 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.080017 4735 generic.go:334] "Generic (PLEG): container finished" podID="63cf5011-57e9-44fa-a662-f30391ef1ff8" containerID="364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481" exitCode=0 Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.080084 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" event={"ID":"63cf5011-57e9-44fa-a662-f30391ef1ff8","Type":"ContainerDied","Data":"364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481"} Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.086611 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.091300 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.091536 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.091554 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.091569 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:50 
crc kubenswrapper[4735]: I1001 10:17:50.091582 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:50Z","lastTransitionTime":"2025-10-01T10:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.096628 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:50Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.107806 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.110882 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:50Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.127732 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b444968cd2dd7c27e3ab5de010228bdb6aca03388c8bf326ff5dc5b65d59a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a3
2554baabf84253c7bb4d630c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:50Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.140887 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:50Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.154756 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:50Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.171619 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc
84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:50Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.183380 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73b8adee3754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:50Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.193436 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.193508 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.193522 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.193539 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.193548 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:50Z","lastTransitionTime":"2025-10-01T10:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.196476 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:50Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.220252 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with 
unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\
\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d15a6854c302bea8840dd226b3d3bcef747aad242a1f817ac0651e90feac4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:50Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.234647 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:50Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.249071 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:50Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.264748 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:17:50Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.281276 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:17:50Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.293054 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b331af5f1d4f4b11c27d1c2285e760a27f9af30ff7fba24c2b73b63385f1e9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:50Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.296450 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.296528 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.296541 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.296565 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.296582 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:50Z","lastTransitionTime":"2025-10-01T10:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.307778 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:50Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.322385 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:50Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.350486 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:50Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.369965 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73b8adee3754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:50Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.395125 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:50Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.400234 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.400283 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.400297 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.400318 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.400332 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:50Z","lastTransitionTime":"2025-10-01T10:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.418305 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d15a6854c302bea8840dd226b3d3bcef747aad242a1f817ac0651e90feac4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:50Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.431617 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:50Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.446580 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:50Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.459678 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:17:50Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.473464 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:17:50Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.483264 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b331af5f1d4f4b11c27d1c2285e760a27f9af30ff7fba24c2b73b63385f1e9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:50Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.495255 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:50Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.503829 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.503907 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.503928 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.503954 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.503975 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:50Z","lastTransitionTime":"2025-10-01T10:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.510045 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:50Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.525580 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b444968cd2dd7c27e3ab5de010228bdb6aca03388c8bf326ff5dc5b65d59a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a3
2554baabf84253c7bb4d630c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:50Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.607008 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.607058 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.607066 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:50 crc 
kubenswrapper[4735]: I1001 10:17:50.607081 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.607093 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:50Z","lastTransitionTime":"2025-10-01T10:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.709446 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.709513 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.709525 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.709539 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.709551 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:50Z","lastTransitionTime":"2025-10-01T10:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.811994 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.812025 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.812035 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.812047 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.812055 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:50Z","lastTransitionTime":"2025-10-01T10:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.889900 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.889946 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.889956 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.889972 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.889982 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:50Z","lastTransitionTime":"2025-10-01T10:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.896473 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.896567 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:17:50 crc kubenswrapper[4735]: E1001 10:17:50.896609 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.896625 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:17:50 crc kubenswrapper[4735]: E1001 10:17:50.896703 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:17:50 crc kubenswrapper[4735]: E1001 10:17:50.896816 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:17:50 crc kubenswrapper[4735]: E1001 10:17:50.902594 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:50Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.906652 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.906686 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.906700 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.906719 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.906736 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:50Z","lastTransitionTime":"2025-10-01T10:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:50 crc kubenswrapper[4735]: E1001 10:17:50.919641 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:50Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.923876 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.923925 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.923935 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.923957 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.923968 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:50Z","lastTransitionTime":"2025-10-01T10:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:50 crc kubenswrapper[4735]: E1001 10:17:50.936817 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:50Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.941744 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.941805 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.941821 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.941846 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.941863 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:50Z","lastTransitionTime":"2025-10-01T10:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:50 crc kubenswrapper[4735]: E1001 10:17:50.955470 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:50Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.964446 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.964483 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.964491 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.964519 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.964530 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:50Z","lastTransitionTime":"2025-10-01T10:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:50 crc kubenswrapper[4735]: E1001 10:17:50.977614 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:50Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:50 crc kubenswrapper[4735]: E1001 10:17:50.977736 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.979657 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.979691 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.979701 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.979717 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:50 crc kubenswrapper[4735]: I1001 10:17:50.979730 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:50Z","lastTransitionTime":"2025-10-01T10:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.082539 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.082598 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.082614 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.082634 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.082646 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:51Z","lastTransitionTime":"2025-10-01T10:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.087118 4735 generic.go:334] "Generic (PLEG): container finished" podID="63cf5011-57e9-44fa-a662-f30391ef1ff8" containerID="3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab" exitCode=0 Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.087200 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" event={"ID":"63cf5011-57e9-44fa-a662-f30391ef1ff8","Type":"ContainerDied","Data":"3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab"} Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.088213 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.106929 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73b8adee3754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.114863 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.118760 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.138939 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d15a6854c302bea8840dd226b3d3bcef747aad242a1f817ac0651e90feac4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.151891 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.166117 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.177711 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:17:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.184933 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.184969 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.184981 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.184997 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.185007 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:51Z","lastTransitionTime":"2025-10-01T10:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.195140 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:51Z 
is after 2025-08-24T17:21:41Z" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.207401 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b331af5f1d4f4b11c27d1c2285e760a27f9af30ff7fba24c2b73b63385f1e9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.222726 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.238689 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.252146 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b444968cd2dd7c27e3ab5de010228bdb6aca03388c8bf326ff5dc5b65d59a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a3
2554baabf84253c7bb4d630c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.264884 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.280344 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.287778 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.287836 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.287849 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.287868 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.287879 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:51Z","lastTransitionTime":"2025-10-01T10:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.297298 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.308175 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.322914 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.335986 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:17:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.350782 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:17:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.362395 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b331af5f1d4f4b11c27d1c2285e760a27f9af30ff7fba24c2b73b63385f1e9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.374614 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.389245 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.392041 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.392089 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.392102 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:51 crc 
kubenswrapper[4735]: I1001 10:17:51.392119 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.392130 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:51Z","lastTransitionTime":"2025-10-01T10:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.402223 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b444968cd2dd7c27e3ab5de010228bdb6aca03388c8bf326ff5dc5b65d59a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a32554baabf84253c7bb4d630c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.419101 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.433103 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.449075 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.468872 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d15a6854c302bea8840dd226b3d3bcef747aad242a1f817ac0651e90feac4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.481778 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73b8adee3754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.494405 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.494453 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.494466 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.494520 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.494548 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:51Z","lastTransitionTime":"2025-10-01T10:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.495225 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.597182 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.597227 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.597240 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.597259 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.597273 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:51Z","lastTransitionTime":"2025-10-01T10:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.699335 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.699603 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.699670 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.699740 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.699830 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:51Z","lastTransitionTime":"2025-10-01T10:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.802335 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.802380 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.802393 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.802409 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.802419 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:51Z","lastTransitionTime":"2025-10-01T10:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.904756 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.904786 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.904795 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.904811 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.904820 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:51Z","lastTransitionTime":"2025-10-01T10:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.909083 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.919201 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.935771 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.949196 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73b8adee3754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.960306 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.978328 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d15a6854c302bea8840dd226b3d3bcef747aad242a1f817ac0651e90feac4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:51 crc kubenswrapper[4735]: I1001 10:17:51.995458 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.006880 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.007251 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.007276 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.007283 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.007326 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.007336 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:52Z","lastTransitionTime":"2025-10-01T10:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.019254 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.034176 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.045602 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b331af5f1d4f4b11c27d1c2285e760a27f9af30ff7fba24c2b73b63385f1e9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.060443 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.071684 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.081573 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b444968cd2dd7c27e3ab5de010228bdb6aca03388c8bf326ff5dc5b65d59a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a3
2554baabf84253c7bb4d630c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.093560 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" event={"ID":"63cf5011-57e9-44fa-a662-f30391ef1ff8","Type":"ContainerStarted","Data":"98eb54784a97eb54cbe3bff224f952baab0bcf996cf032f3a81d336b8aa60d51"} Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.105912 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73b8adee3754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.109422 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.109447 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.109456 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.109469 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.109478 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:52Z","lastTransitionTime":"2025-10-01T10:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.116316 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.133907 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metric
s-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-contro
ller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d15a6854c302bea8840dd226b3d3bcef747aad242a1f817ac0651e90feac4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.145525 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T1
0:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.156750 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.166392 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b331af5f1d4f4b11c27d1c2285e760a27f9af30ff7fba24c2b73b63385f1e9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.177088 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.187850 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.196410 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b444968cd2dd7c27e3ab5de010228bdb6aca03388c8bf326ff5dc5b65d59a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a3
2554baabf84253c7bb4d630c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.207644 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.210967 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.210999 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.211007 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.211020 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.211028 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:52Z","lastTransitionTime":"2025-10-01T10:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.218885 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.231091 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98eb54784a97eb54cbe3bff224f952baab0bcf996cf032f3a81d336b8aa60d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f224
7750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.242944 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.254271 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.313168 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.313206 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.313217 4735 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.313233 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.313243 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:52Z","lastTransitionTime":"2025-10-01T10:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.415940 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.415978 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.415989 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.416004 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.416013 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:52Z","lastTransitionTime":"2025-10-01T10:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.479687 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.492690 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.506182 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.518160 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.518209 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.518222 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.518240 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.518252 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:52Z","lastTransitionTime":"2025-10-01T10:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.519209 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.531985 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.541733 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b331af5f1d4f4b11c27d1c2285e760a27f9af30ff7fba24c2b73b63385f1e9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.554114 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7
f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.566420 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.576245 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b444968cd2dd7c27e3ab5de010228bdb6aca03388c8bf326ff5dc5b65d59a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a3
2554baabf84253c7bb4d630c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.590479 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.601843 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.616999 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98eb54784a97eb54cbe3bff224f952baab0bcf996cf032f3a81d336b8aa60d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f224
7750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.619966 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.619997 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.620007 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.620020 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.620029 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:52Z","lastTransitionTime":"2025-10-01T10:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.627833 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105
041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73b8adee3754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.637305 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.655039 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d15a6854c302bea8840dd226b3d3bcef747aad242a1f817ac0651e90feac4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.722777 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.722815 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.722827 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.722842 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.722853 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:52Z","lastTransitionTime":"2025-10-01T10:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.825286 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.825325 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.825334 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.825348 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.825361 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:52Z","lastTransitionTime":"2025-10-01T10:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.896098 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.896183 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.896094 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:17:52 crc kubenswrapper[4735]: E1001 10:17:52.896326 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:17:52 crc kubenswrapper[4735]: E1001 10:17:52.896239 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:17:52 crc kubenswrapper[4735]: E1001 10:17:52.896389 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.927853 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.927885 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.927897 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.927913 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:52 crc kubenswrapper[4735]: I1001 10:17:52.927925 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:52Z","lastTransitionTime":"2025-10-01T10:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.030281 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.030321 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.030330 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.030344 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.030354 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:53Z","lastTransitionTime":"2025-10-01T10:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.097652 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k5mgz_32f1531f-5034-48d4-b694-efc774226e37/ovnkube-controller/0.log" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.099887 4735 generic.go:334] "Generic (PLEG): container finished" podID="32f1531f-5034-48d4-b694-efc774226e37" containerID="31d15a6854c302bea8840dd226b3d3bcef747aad242a1f817ac0651e90feac4b" exitCode=1 Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.100009 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" event={"ID":"32f1531f-5034-48d4-b694-efc774226e37","Type":"ContainerDied","Data":"31d15a6854c302bea8840dd226b3d3bcef747aad242a1f817ac0651e90feac4b"} Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.101156 4735 scope.go:117] "RemoveContainer" containerID="31d15a6854c302bea8840dd226b3d3bcef747aad242a1f817ac0651e90feac4b" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.116369 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98eb54784a97eb54cbe3bff224f952baab0bcf996cf032f3a81d336b8aa60d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f224
7750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:53Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.128513 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:53Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.132411 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.132737 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.132758 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.132773 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.132782 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:53Z","lastTransitionTime":"2025-10-01T10:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.142963 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:53Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.157255 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73b8adee3754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:53Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.169706 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:53Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.191231 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31d15a6854c302bea8840dd226b3d3bcef747aad242a1f817ac0651e90feac4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31d15a6854c302bea8840dd226b3d3bcef747aad242a1f817ac0651e90feac4b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"message\\\":\\\"5 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 
10:17:52.086989 6025 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 10:17:52.087002 6025 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 10:17:52.087014 6025 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 10:17:52.087023 6025 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 10:17:52.087040 6025 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 10:17:52.087330 6025 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 10:17:52.087340 6025 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 10:17:52.087345 6025 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 10:17:52.087363 6025 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 10:17:52.087368 6025 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 10:17:52.087378 6025 factory.go:656] Stopping watch factory\\\\nI1001 10:17:52.087387 6025 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:53Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.202642 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:17:53Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.216076 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:17:53Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.226054 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b331af5f1d4f4b11c27d1c2285e760a27f9af30ff7fba24c2b73b63385f1e9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:53Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.236413 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.236439 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.236447 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.236459 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.236468 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:53Z","lastTransitionTime":"2025-10-01T10:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.238473 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:53Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.250879 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:53Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.263764 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b444968cd2dd7c27e3ab5de010228bdb6aca03388c8bf326ff5dc5b65d59a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a3
2554baabf84253c7bb4d630c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:53Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.279081 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7
f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:53Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.292060 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:53Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.338749 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.338785 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.338796 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:53 crc 
kubenswrapper[4735]: I1001 10:17:53.338812 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.338839 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:53Z","lastTransitionTime":"2025-10-01T10:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.441148 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.441179 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.441188 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.441202 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.441212 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:53Z","lastTransitionTime":"2025-10-01T10:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.543326 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.543365 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.543376 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.543392 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.543404 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:53Z","lastTransitionTime":"2025-10-01T10:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.645149 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.645183 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.645191 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.645207 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.645216 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:53Z","lastTransitionTime":"2025-10-01T10:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.748238 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.748287 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.748305 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.748323 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.748335 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:53Z","lastTransitionTime":"2025-10-01T10:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.851153 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.851191 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.851202 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.851218 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.851228 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:53Z","lastTransitionTime":"2025-10-01T10:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.953423 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.953468 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.953480 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.953515 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:53 crc kubenswrapper[4735]: I1001 10:17:53.953528 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:53Z","lastTransitionTime":"2025-10-01T10:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.055677 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.055730 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.055742 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.055758 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.055769 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:54Z","lastTransitionTime":"2025-10-01T10:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.104625 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k5mgz_32f1531f-5034-48d4-b694-efc774226e37/ovnkube-controller/1.log" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.105843 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k5mgz_32f1531f-5034-48d4-b694-efc774226e37/ovnkube-controller/0.log" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.109448 4735 generic.go:334] "Generic (PLEG): container finished" podID="32f1531f-5034-48d4-b694-efc774226e37" containerID="4b1f5bbf3b5c3ce561b79914cbee1087f704ec5463ca49131e9975f04a5d285f" exitCode=1 Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.109563 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" event={"ID":"32f1531f-5034-48d4-b694-efc774226e37","Type":"ContainerDied","Data":"4b1f5bbf3b5c3ce561b79914cbee1087f704ec5463ca49131e9975f04a5d285f"} Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.109706 4735 scope.go:117] "RemoveContainer" containerID="31d15a6854c302bea8840dd226b3d3bcef747aad242a1f817ac0651e90feac4b" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.111560 4735 scope.go:117] "RemoveContainer" containerID="4b1f5bbf3b5c3ce561b79914cbee1087f704ec5463ca49131e9975f04a5d285f" Oct 01 10:17:54 crc kubenswrapper[4735]: E1001 10:17:54.111893 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-k5mgz_openshift-ovn-kubernetes(32f1531f-5034-48d4-b694-efc774226e37)\"" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" podUID="32f1531f-5034-48d4-b694-efc774226e37" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.129440 4735 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:54Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.144194 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98eb54784a97eb54cbe3bff224f952baab0bcf996cf032f3a81d336b8aa60d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f224
7750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:54Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.158544 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.158588 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.158598 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.158615 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.158625 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:54Z","lastTransitionTime":"2025-10-01T10:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.160631 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:54Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.163998 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj"] Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.164467 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.165966 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.166303 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.172820 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73b8adee3754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:54Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.173943 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zcvzj\" (UID: \"358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.173983 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-zcvzj\" (UID: \"358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.174009 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2nqt\" (UniqueName: 
\"kubernetes.io/projected/358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6-kube-api-access-l2nqt\") pod \"ovnkube-control-plane-749d76644c-zcvzj\" (UID: \"358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.174194 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zcvzj\" (UID: \"358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.183396 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:54Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.204234 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b1f5bbf3b5c3ce561b79914cbee1087f704ec5463ca49131e9975f04a5d285f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31d15a6854c302bea8840dd226b3d3bcef747aad242a1f817ac0651e90feac4b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"message\\\":\\\"5 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 10:17:52.086989 6025 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 10:17:52.087002 6025 
handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 10:17:52.087014 6025 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 10:17:52.087023 6025 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 10:17:52.087040 6025 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 10:17:52.087330 6025 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 10:17:52.087340 6025 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 10:17:52.087345 6025 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 10:17:52.087363 6025 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 10:17:52.087368 6025 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 10:17:52.087378 6025 factory.go:656] Stopping watch factory\\\\nI1001 10:17:52.087387 6025 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b1f5bbf3b5c3ce561b79914cbee1087f704ec5463ca49131e9975f04a5d285f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T10:17:53Z\\\",\\\"message\\\":\\\"hook for network=default\\\\nI1001 10:17:53.802247 6187 ovnkube.go:599] Stopped ovnkube\\\\nI1001 10:17:53.802243 6187 obj_retry.go:409] Going to retry *v1.Pod resource setup for 12 objects: [openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-ovn-kubernetes/ovnkube-node-k5mgz openshift-image-registry/node-ca-zdmp4 openshift-machine-config-operator/machine-config-daemon-xgg24 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-multus/multus-additional-cni-plugins-6qlsd 
openshift-dns/node-resolver-5n9cx openshift-kube-apiserver/kube-apiserver-crc openshift-multus/multus-8dz9b openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/iptables-alerter-4ln5h openshift-network-diagnostics/network-check-source-55646444c4-trplf]\\\\nI1001 10:17:53.802329 6187 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1001 10:17:53.802112 6187 services_controller.go:445] Built service openshift-kube-apiserver/apiserver LB template configs for network=default: []services.lbConfig(nil)\\\\nF1001 10:17:53.802435 6187 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller ini\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-net
d\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:54Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.217359 4735 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:54Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.230477 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:17:54Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.245593 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:17:54Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.257326 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b331af5f1d4f4b11c27d1c2285e760a27f9af30ff7fba24c2b73b63385f1e9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:54Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.261437 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.261487 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.261518 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.261537 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.261551 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:54Z","lastTransitionTime":"2025-10-01T10:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.270100 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:54Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.275438 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zcvzj\" (UID: \"358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.275549 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-zcvzj\" (UID: \"358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.275585 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2nqt\" (UniqueName: 
\"kubernetes.io/projected/358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6-kube-api-access-l2nqt\") pod \"ovnkube-control-plane-749d76644c-zcvzj\" (UID: \"358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.275646 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zcvzj\" (UID: \"358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.276259 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zcvzj\" (UID: \"358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.276459 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-zcvzj\" (UID: \"358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.283674 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zcvzj\" (UID: \"358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" Oct 01 10:17:54 
crc kubenswrapper[4735]: I1001 10:17:54.284831 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:54Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.292370 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2nqt\" (UniqueName: \"kubernetes.io/projected/358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6-kube-api-access-l2nqt\") pod \"ovnkube-control-plane-749d76644c-zcvzj\" (UID: \"358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.295227 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b444968cd2dd7c27e3ab5de010228bdb6aca03388c8bf326ff5dc5b65d59a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a3
2554baabf84253c7bb4d630c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:54Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.308784 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7
f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:54Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.322361 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73b8adee3754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:54Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.333383 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:54Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.352605 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b1f5bbf3b5c3ce561b79914cbee1087f704ec5463ca49131e9975f04a5d285f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31d15a6854c302bea8840dd226b3d3bcef747aad242a1f817ac0651e90feac4b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"message\\\":\\\"5 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 10:17:52.086989 6025 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 10:17:52.087002 6025 
handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 10:17:52.087014 6025 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 10:17:52.087023 6025 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 10:17:52.087040 6025 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 10:17:52.087330 6025 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 10:17:52.087340 6025 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 10:17:52.087345 6025 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 10:17:52.087363 6025 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 10:17:52.087368 6025 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 10:17:52.087378 6025 factory.go:656] Stopping watch factory\\\\nI1001 10:17:52.087387 6025 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b1f5bbf3b5c3ce561b79914cbee1087f704ec5463ca49131e9975f04a5d285f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T10:17:53Z\\\",\\\"message\\\":\\\"hook for network=default\\\\nI1001 10:17:53.802247 6187 ovnkube.go:599] Stopped ovnkube\\\\nI1001 10:17:53.802243 6187 obj_retry.go:409] Going to retry *v1.Pod resource setup for 12 objects: [openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-ovn-kubernetes/ovnkube-node-k5mgz openshift-image-registry/node-ca-zdmp4 openshift-machine-config-operator/machine-config-daemon-xgg24 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-multus/multus-additional-cni-plugins-6qlsd 
openshift-dns/node-resolver-5n9cx openshift-kube-apiserver/kube-apiserver-crc openshift-multus/multus-8dz9b openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/iptables-alerter-4ln5h openshift-network-diagnostics/network-check-source-55646444c4-trplf]\\\\nI1001 10:17:53.802329 6187 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1001 10:17:53.802112 6187 services_controller.go:445] Built service openshift-kube-apiserver/apiserver LB template configs for network=default: []services.lbConfig(nil)\\\\nF1001 10:17:53.802435 6187 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller ini\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-net
d\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:54Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.364513 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.364570 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.364582 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.364601 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.364617 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:54Z","lastTransitionTime":"2025-10-01T10:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.367182 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zcvzj\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:54Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.379679 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:54Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.392239 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:54Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.402899 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:17:54Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.417173 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:17:54Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.426937 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b331af5f1d4f4b11c27d1c2285e760a27f9af30ff7fba24c2b73b63385f1e9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:54Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.440249 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7
f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:54Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.453447 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:54Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.464082 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b444968cd2dd7c27e3ab5de010228bdb6aca03388c8bf326ff5dc5b65d59a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a3
2554baabf84253c7bb4d630c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:54Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.466947 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.466979 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.466990 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:54 crc 
kubenswrapper[4735]: I1001 10:17:54.467006 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.467016 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:54Z","lastTransitionTime":"2025-10-01T10:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.476959 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:54Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.477268 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" Oct 01 10:17:54 crc kubenswrapper[4735]: W1001 10:17:54.491841 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod358434f3_ecfd_4a3e_85d4_0a5d3d12d4d6.slice/crio-29f5f0c18006e6a63f65076f6392a61f47277c9ccb7a520d6bd9298536edc4cf WatchSource:0}: Error finding container 29f5f0c18006e6a63f65076f6392a61f47277c9ccb7a520d6bd9298536edc4cf: Status 404 returned error can't find the container with id 29f5f0c18006e6a63f65076f6392a61f47277c9ccb7a520d6bd9298536edc4cf Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.494573 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:54Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.511371 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98eb54784a97eb54cbe3bff224f952baab0bcf996cf032f3a81d336b8aa60d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f224
7750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:54Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.569890 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.569949 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.569960 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.569982 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.569993 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:54Z","lastTransitionTime":"2025-10-01T10:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.672935 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.672980 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.672992 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.673014 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.673026 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:54Z","lastTransitionTime":"2025-10-01T10:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.776815 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.776860 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.776872 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.776893 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.776907 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:54Z","lastTransitionTime":"2025-10-01T10:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.879333 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.879381 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.879392 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.879410 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.879422 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:54Z","lastTransitionTime":"2025-10-01T10:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.896003 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.896073 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.896093 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:17:54 crc kubenswrapper[4735]: E1001 10:17:54.896134 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:17:54 crc kubenswrapper[4735]: E1001 10:17:54.896207 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:17:54 crc kubenswrapper[4735]: E1001 10:17:54.896313 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.981711 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.981750 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.981760 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.981775 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:54 crc kubenswrapper[4735]: I1001 10:17:54.981786 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:54Z","lastTransitionTime":"2025-10-01T10:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.083971 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.084006 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.084014 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.084028 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.084037 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:55Z","lastTransitionTime":"2025-10-01T10:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.114430 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k5mgz_32f1531f-5034-48d4-b694-efc774226e37/ovnkube-controller/1.log" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.118487 4735 scope.go:117] "RemoveContainer" containerID="4b1f5bbf3b5c3ce561b79914cbee1087f704ec5463ca49131e9975f04a5d285f" Oct 01 10:17:55 crc kubenswrapper[4735]: E1001 10:17:55.118767 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-k5mgz_openshift-ovn-kubernetes(32f1531f-5034-48d4-b694-efc774226e37)\"" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" podUID="32f1531f-5034-48d4-b694-efc774226e37" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.120725 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" event={"ID":"358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6","Type":"ContainerStarted","Data":"cbe000aa1f2ef04cd49dd91e7ec18f7f040623e9cf2f86354c8ae6fbab575080"} Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.120763 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" event={"ID":"358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6","Type":"ContainerStarted","Data":"a9b48b18fd3ad59b9decc42e44539f17f32fb5d6c811e3d2beda91071fc76fea"} Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.120778 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" event={"ID":"358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6","Type":"ContainerStarted","Data":"29f5f0c18006e6a63f65076f6392a61f47277c9ccb7a520d6bd9298536edc4cf"} Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.132257 4735 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\"
:\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:55Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.143068 4735 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b331af5f1d4f4b11c27d1c2285e760a27f9af30ff7fba24c2b73b63385f1e9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:55Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.158825 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:55Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.169652 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:55Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.178778 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:17:55Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.186523 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.186561 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.186574 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.186591 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.186603 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:55Z","lastTransitionTime":"2025-10-01T10:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.195327 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:55Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.208545 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:55Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.217435 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b444968cd2dd7c27e3ab5de010228bdb6aca03388c8bf326ff5dc5b65d59a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a3
2554baabf84253c7bb4d630c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:55Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.228173 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:55Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.242664 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:55Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.259330 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98eb54784a97eb54cbe3bff224f952baab0bcf996cf032f3a81d336b8aa60d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f224
7750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:55Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.269456 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:55Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.289157 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.289198 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.289207 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.289223 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.289232 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:55Z","lastTransitionTime":"2025-10-01T10:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.292945 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b1f5bbf3b5c3ce561b79914cbee1087f704ec5463ca49131e9975f04a5d285f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b1f5bbf3b5c3ce561b79914cbee1087f704ec5463ca49131e9975f04a5d285f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T10:17:53Z\\\",\\\"message\\\":\\\"hook for network=default\\\\nI1001 10:17:53.802247 6187 ovnkube.go:599] Stopped ovnkube\\\\nI1001 10:17:53.802243 6187 obj_retry.go:409] Going to retry *v1.Pod resource setup for 12 objects: [openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-ovn-kubernetes/ovnkube-node-k5mgz openshift-image-registry/node-ca-zdmp4 
openshift-machine-config-operator/machine-config-daemon-xgg24 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-multus/multus-additional-cni-plugins-6qlsd openshift-dns/node-resolver-5n9cx openshift-kube-apiserver/kube-apiserver-crc openshift-multus/multus-8dz9b openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/iptables-alerter-4ln5h openshift-network-diagnostics/network-check-source-55646444c4-trplf]\\\\nI1001 10:17:53.802329 6187 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1001 10:17:53.802112 6187 services_controller.go:445] Built service openshift-kube-apiserver/apiserver LB template configs for network=default: []services.lbConfig(nil)\\\\nF1001 10:17:53.802435 6187 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller ini\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-k5mgz_openshift-ovn-kubernetes(32f1531f-5034-48d4-b694-efc774226e37)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c
026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:55Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.294583 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-qm6mr"] Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.295391 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:17:55 crc kubenswrapper[4735]: E1001 10:17:55.295537 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.303727 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zcvzj\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:55Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.315020 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73b8adee3754\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:55Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.324379 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b331af5f1d4f4b11c27d1c2285e760a27f9af30ff7fba24c2b73b63385f1e9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:55Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.335619 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:55Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.346099 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:55Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.356264 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:17:55Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.367649 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:17:55Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.384879 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\"
:\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiser
ver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 
10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:55Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.388416 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77b56a8b-1a27-4727-b45e-43fbc3847ddd-metrics-certs\") pod \"network-metrics-daemon-qm6mr\" (UID: \"77b56a8b-1a27-4727-b45e-43fbc3847ddd\") " pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.388458 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sffkg\" (UniqueName: \"kubernetes.io/projected/77b56a8b-1a27-4727-b45e-43fbc3847ddd-kube-api-access-sffkg\") pod \"network-metrics-daemon-qm6mr\" (UID: \"77b56a8b-1a27-4727-b45e-43fbc3847ddd\") " pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:17:55 
crc kubenswrapper[4735]: I1001 10:17:55.391878 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.391977 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.391989 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.392004 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.392014 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:55Z","lastTransitionTime":"2025-10-01T10:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.406150 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:55Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.422288 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b444968cd2dd7c27e3ab5de010228bdb6aca03388c8bf326ff5dc5b65d59a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a3
2554baabf84253c7bb4d630c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:55Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.439524 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:55Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.450655 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:55Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.463605 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98eb54784a97eb54cbe3bff224f952baab0bcf996cf032f3a81d336b8aa60d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f224
7750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:55Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.475598 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qm6mr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77b56a8b-1a27-4727-b45e-43fbc3847ddd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sffkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sffkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qm6mr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:55Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:55 crc 
kubenswrapper[4735]: I1001 10:17:55.485455 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:55Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.489187 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77b56a8b-1a27-4727-b45e-43fbc3847ddd-metrics-certs\") pod \"network-metrics-daemon-qm6mr\" (UID: \"77b56a8b-1a27-4727-b45e-43fbc3847ddd\") " pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.489225 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sffkg\" (UniqueName: \"kubernetes.io/projected/77b56a8b-1a27-4727-b45e-43fbc3847ddd-kube-api-access-sffkg\") pod \"network-metrics-daemon-qm6mr\" (UID: \"77b56a8b-1a27-4727-b45e-43fbc3847ddd\") " pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:17:55 crc kubenswrapper[4735]: E1001 10:17:55.489429 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 10:17:55 crc kubenswrapper[4735]: E1001 10:17:55.489653 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77b56a8b-1a27-4727-b45e-43fbc3847ddd-metrics-certs podName:77b56a8b-1a27-4727-b45e-43fbc3847ddd nodeName:}" failed. 
No retries permitted until 2025-10-01 10:17:55.989619277 +0000 UTC m=+34.682440749 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/77b56a8b-1a27-4727-b45e-43fbc3847ddd-metrics-certs") pod "network-metrics-daemon-qm6mr" (UID: "77b56a8b-1a27-4727-b45e-43fbc3847ddd") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.494091 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.494134 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.494146 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.494163 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.494175 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:55Z","lastTransitionTime":"2025-10-01T10:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.503815 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b1f5bbf3b5c3ce561b79914cbee1087f704ec5463ca49131e9975f04a5d285f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b1f5bbf3b5c3ce561b79914cbee1087f704ec5463ca49131e9975f04a5d285f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T10:17:53Z\\\",\\\"message\\\":\\\"hook for network=default\\\\nI1001 10:17:53.802247 6187 ovnkube.go:599] Stopped ovnkube\\\\nI1001 10:17:53.802243 6187 obj_retry.go:409] Going to retry *v1.Pod resource setup for 12 objects: [openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-ovn-kubernetes/ovnkube-node-k5mgz openshift-image-registry/node-ca-zdmp4 
openshift-machine-config-operator/machine-config-daemon-xgg24 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-multus/multus-additional-cni-plugins-6qlsd openshift-dns/node-resolver-5n9cx openshift-kube-apiserver/kube-apiserver-crc openshift-multus/multus-8dz9b openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/iptables-alerter-4ln5h openshift-network-diagnostics/network-check-source-55646444c4-trplf]\\\\nI1001 10:17:53.802329 6187 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1001 10:17:53.802112 6187 services_controller.go:445] Built service openshift-kube-apiserver/apiserver LB template configs for network=default: []services.lbConfig(nil)\\\\nF1001 10:17:53.802435 6187 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller ini\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-k5mgz_openshift-ovn-kubernetes(32f1531f-5034-48d4-b694-efc774226e37)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c
026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:55Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.506481 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sffkg\" (UniqueName: \"kubernetes.io/projected/77b56a8b-1a27-4727-b45e-43fbc3847ddd-kube-api-access-sffkg\") pod \"network-metrics-daemon-qm6mr\" (UID: \"77b56a8b-1a27-4727-b45e-43fbc3847ddd\") " pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.516333 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9b48b18fd3ad59b9decc42e44539f17f32fb5d6c811e3d2beda91071fc76fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbe000aa1f2ef04cd49dd91e7ec18f7f04062
3e9cf2f86354c8ae6fbab575080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zcvzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:55Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.527601 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73b8adee3754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:17:55Z is after 2025-08-24T17:21:41Z" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.596401 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.596726 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.596827 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.596915 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.597003 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:55Z","lastTransitionTime":"2025-10-01T10:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.699740 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.699775 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.699784 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.699799 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.699808 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:55Z","lastTransitionTime":"2025-10-01T10:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.801893 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.801941 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.801951 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.801966 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.801977 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:55Z","lastTransitionTime":"2025-10-01T10:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.903246 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.903278 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.903286 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.903297 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.903305 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:55Z","lastTransitionTime":"2025-10-01T10:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:55 crc kubenswrapper[4735]: I1001 10:17:55.994037 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77b56a8b-1a27-4727-b45e-43fbc3847ddd-metrics-certs\") pod \"network-metrics-daemon-qm6mr\" (UID: \"77b56a8b-1a27-4727-b45e-43fbc3847ddd\") " pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:17:55 crc kubenswrapper[4735]: E1001 10:17:55.994170 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 10:17:55 crc kubenswrapper[4735]: E1001 10:17:55.994232 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77b56a8b-1a27-4727-b45e-43fbc3847ddd-metrics-certs podName:77b56a8b-1a27-4727-b45e-43fbc3847ddd nodeName:}" failed. No retries permitted until 2025-10-01 10:17:56.994215019 +0000 UTC m=+35.687036281 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/77b56a8b-1a27-4727-b45e-43fbc3847ddd-metrics-certs") pod "network-metrics-daemon-qm6mr" (UID: "77b56a8b-1a27-4727-b45e-43fbc3847ddd") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.005867 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.005920 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.005932 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.005957 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.005970 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:56Z","lastTransitionTime":"2025-10-01T10:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.108214 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.108458 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.108472 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.108487 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.108518 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:56Z","lastTransitionTime":"2025-10-01T10:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.210319 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.210365 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.210373 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.210388 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.210398 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:56Z","lastTransitionTime":"2025-10-01T10:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.312175 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.312205 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.312214 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.312226 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.312235 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:56Z","lastTransitionTime":"2025-10-01T10:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.415061 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.415113 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.415130 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.415151 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.415167 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:56Z","lastTransitionTime":"2025-10-01T10:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.517922 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.517971 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.517980 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.517997 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.518007 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:56Z","lastTransitionTime":"2025-10-01T10:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.620828 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.620866 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.620874 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.620887 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.620896 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:56Z","lastTransitionTime":"2025-10-01T10:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.723012 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.723062 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.723073 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.723092 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.723105 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:56Z","lastTransitionTime":"2025-10-01T10:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.801813 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.801885 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.801906 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.801943 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.801964 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:17:56 crc kubenswrapper[4735]: E1001 10:17:56.802037 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 10:17:56 crc kubenswrapper[4735]: E1001 10:17:56.802058 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:18:12.802027517 +0000 UTC m=+51.494848789 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:17:56 crc kubenswrapper[4735]: E1001 10:17:56.802103 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 10:18:12.802092769 +0000 UTC m=+51.494914041 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 10:17:56 crc kubenswrapper[4735]: E1001 10:17:56.802107 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 10:17:56 crc kubenswrapper[4735]: E1001 10:17:56.802127 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 10:17:56 crc kubenswrapper[4735]: E1001 10:17:56.802139 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 10:17:56 crc kubenswrapper[4735]: E1001 10:17:56.802174 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 10:18:12.802159851 +0000 UTC m=+51.494981113 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 10:17:56 crc kubenswrapper[4735]: E1001 10:17:56.802178 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 10:17:56 crc kubenswrapper[4735]: E1001 10:17:56.802270 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 10:17:56 crc kubenswrapper[4735]: E1001 10:17:56.802315 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 10:17:56 crc kubenswrapper[4735]: E1001 10:17:56.802340 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 10:17:56 crc kubenswrapper[4735]: E1001 10:17:56.802293 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 10:18:12.802264144 +0000 UTC m=+51.495085446 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 10:17:56 crc kubenswrapper[4735]: E1001 10:17:56.802441 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 10:18:12.802403928 +0000 UTC m=+51.495225210 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.825770 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.825824 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.825840 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.825867 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.825884 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:56Z","lastTransitionTime":"2025-10-01T10:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.896991 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.897061 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.897102 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.897127 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:17:56 crc kubenswrapper[4735]: E1001 10:17:56.898034 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:17:56 crc kubenswrapper[4735]: E1001 10:17:56.898205 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:17:56 crc kubenswrapper[4735]: E1001 10:17:56.898433 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:17:56 crc kubenswrapper[4735]: E1001 10:17:56.898536 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.927563 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.927618 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.927636 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.927657 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:56 crc kubenswrapper[4735]: I1001 10:17:56.927672 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:56Z","lastTransitionTime":"2025-10-01T10:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.004211 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77b56a8b-1a27-4727-b45e-43fbc3847ddd-metrics-certs\") pod \"network-metrics-daemon-qm6mr\" (UID: \"77b56a8b-1a27-4727-b45e-43fbc3847ddd\") " pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:17:57 crc kubenswrapper[4735]: E1001 10:17:57.004388 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 10:17:57 crc kubenswrapper[4735]: E1001 10:17:57.004450 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77b56a8b-1a27-4727-b45e-43fbc3847ddd-metrics-certs podName:77b56a8b-1a27-4727-b45e-43fbc3847ddd nodeName:}" failed. No retries permitted until 2025-10-01 10:17:59.004433019 +0000 UTC m=+37.697254291 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/77b56a8b-1a27-4727-b45e-43fbc3847ddd-metrics-certs") pod "network-metrics-daemon-qm6mr" (UID: "77b56a8b-1a27-4727-b45e-43fbc3847ddd") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.030323 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.030411 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.030486 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.030537 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.030559 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:57Z","lastTransitionTime":"2025-10-01T10:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.132190 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.132252 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.132266 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.132283 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.132294 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:57Z","lastTransitionTime":"2025-10-01T10:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.235421 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.235597 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.235621 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.235653 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.235681 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:57Z","lastTransitionTime":"2025-10-01T10:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.338210 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.338255 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.338269 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.338290 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.338304 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:57Z","lastTransitionTime":"2025-10-01T10:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.440459 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.440521 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.440537 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.440551 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.440562 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:57Z","lastTransitionTime":"2025-10-01T10:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.543579 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.543618 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.543626 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.543641 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.543651 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:57Z","lastTransitionTime":"2025-10-01T10:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.645260 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.645782 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.645809 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.645837 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.645856 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:57Z","lastTransitionTime":"2025-10-01T10:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.748573 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.748642 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.748659 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.748683 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.748702 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:57Z","lastTransitionTime":"2025-10-01T10:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.851686 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.851756 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.851779 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.851810 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.851833 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:57Z","lastTransitionTime":"2025-10-01T10:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.954600 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.954659 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.954673 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.954694 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:57 crc kubenswrapper[4735]: I1001 10:17:57.954709 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:57Z","lastTransitionTime":"2025-10-01T10:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.058139 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.058205 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.058220 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.058240 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.058254 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:58Z","lastTransitionTime":"2025-10-01T10:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.160640 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.160681 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.160695 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.160712 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.160726 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:58Z","lastTransitionTime":"2025-10-01T10:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.263594 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.263655 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.263673 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.263696 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.263711 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:58Z","lastTransitionTime":"2025-10-01T10:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.366068 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.366098 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.366105 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.366119 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.366127 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:58Z","lastTransitionTime":"2025-10-01T10:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.468590 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.468656 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.468663 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.468678 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.468687 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:58Z","lastTransitionTime":"2025-10-01T10:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.571934 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.572007 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.572019 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.572036 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.572047 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:58Z","lastTransitionTime":"2025-10-01T10:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.674339 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.674378 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.674389 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.674404 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.674416 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:58Z","lastTransitionTime":"2025-10-01T10:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.777003 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.777048 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.777061 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.777079 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.777090 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:58Z","lastTransitionTime":"2025-10-01T10:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.879882 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.879936 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.879945 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.879959 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.879970 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:58Z","lastTransitionTime":"2025-10-01T10:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.896246 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.896297 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.896339 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:17:58 crc kubenswrapper[4735]: E1001 10:17:58.896405 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.896264 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:17:58 crc kubenswrapper[4735]: E1001 10:17:58.896519 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:17:58 crc kubenswrapper[4735]: E1001 10:17:58.896589 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:17:58 crc kubenswrapper[4735]: E1001 10:17:58.896638 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.982985 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.983011 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.983022 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.983036 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:58 crc kubenswrapper[4735]: I1001 10:17:58.983045 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:58Z","lastTransitionTime":"2025-10-01T10:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.025531 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77b56a8b-1a27-4727-b45e-43fbc3847ddd-metrics-certs\") pod \"network-metrics-daemon-qm6mr\" (UID: \"77b56a8b-1a27-4727-b45e-43fbc3847ddd\") " pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:17:59 crc kubenswrapper[4735]: E1001 10:17:59.025666 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 10:17:59 crc kubenswrapper[4735]: E1001 10:17:59.025731 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77b56a8b-1a27-4727-b45e-43fbc3847ddd-metrics-certs podName:77b56a8b-1a27-4727-b45e-43fbc3847ddd nodeName:}" failed. No retries permitted until 2025-10-01 10:18:03.025711023 +0000 UTC m=+41.718532305 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/77b56a8b-1a27-4727-b45e-43fbc3847ddd-metrics-certs") pod "network-metrics-daemon-qm6mr" (UID: "77b56a8b-1a27-4727-b45e-43fbc3847ddd") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.086002 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.086084 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.086107 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.086137 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.086159 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:59Z","lastTransitionTime":"2025-10-01T10:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.188454 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.188518 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.188531 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.188549 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.188560 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:59Z","lastTransitionTime":"2025-10-01T10:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.291153 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.291194 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.291203 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.291225 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.291235 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:59Z","lastTransitionTime":"2025-10-01T10:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.394306 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.394396 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.394416 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.394449 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.394536 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:59Z","lastTransitionTime":"2025-10-01T10:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.496888 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.496949 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.496961 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.496984 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.496998 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:59Z","lastTransitionTime":"2025-10-01T10:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.599673 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.599715 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.599722 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.599734 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.599744 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:59Z","lastTransitionTime":"2025-10-01T10:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.702197 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.702278 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.702298 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.702335 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.702365 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:59Z","lastTransitionTime":"2025-10-01T10:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.805518 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.805581 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.805598 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.805628 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.805644 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:59Z","lastTransitionTime":"2025-10-01T10:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.907882 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.907924 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.907935 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.907947 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:17:59 crc kubenswrapper[4735]: I1001 10:17:59.907956 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:17:59Z","lastTransitionTime":"2025-10-01T10:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.014028 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.014167 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.014359 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.014414 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.014442 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:00Z","lastTransitionTime":"2025-10-01T10:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.117574 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.117622 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.117637 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.117659 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.117675 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:00Z","lastTransitionTime":"2025-10-01T10:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.219833 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.219881 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.219894 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.219911 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.219923 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:00Z","lastTransitionTime":"2025-10-01T10:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.322119 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.322187 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.322215 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.322241 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.322259 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:00Z","lastTransitionTime":"2025-10-01T10:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.424601 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.424666 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.424690 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.424716 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.424735 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:00Z","lastTransitionTime":"2025-10-01T10:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.527072 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.527124 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.527137 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.527156 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.527170 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:00Z","lastTransitionTime":"2025-10-01T10:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.629777 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.629838 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.629856 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.629878 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.629896 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:00Z","lastTransitionTime":"2025-10-01T10:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.732780 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.732819 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.732830 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.732846 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.732860 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:00Z","lastTransitionTime":"2025-10-01T10:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.834891 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.834918 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.834926 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.834939 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.834947 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:00Z","lastTransitionTime":"2025-10-01T10:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.896050 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.896152 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:18:00 crc kubenswrapper[4735]: E1001 10:18:00.896178 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.896049 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.896067 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:18:00 crc kubenswrapper[4735]: E1001 10:18:00.896534 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:18:00 crc kubenswrapper[4735]: E1001 10:18:00.896593 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:18:00 crc kubenswrapper[4735]: E1001 10:18:00.896398 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.938313 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.938365 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.938385 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.938408 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:00 crc kubenswrapper[4735]: I1001 10:18:00.938426 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:00Z","lastTransitionTime":"2025-10-01T10:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.028695 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.028734 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.028743 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.028758 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.028768 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:01Z","lastTransitionTime":"2025-10-01T10:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:01 crc kubenswrapper[4735]: E1001 10:18:01.043564 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:01Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.047051 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.047072 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.047080 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.047092 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.047103 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:01Z","lastTransitionTime":"2025-10-01T10:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:01 crc kubenswrapper[4735]: E1001 10:18:01.065475 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:01Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.070598 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.070636 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.070645 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.070661 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.070670 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:01Z","lastTransitionTime":"2025-10-01T10:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:01 crc kubenswrapper[4735]: E1001 10:18:01.088196 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:01Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.092281 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.092351 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.092361 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.092376 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.092385 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:01Z","lastTransitionTime":"2025-10-01T10:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:01 crc kubenswrapper[4735]: E1001 10:18:01.104353 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:01Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.108292 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.108357 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.108382 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.108411 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.108433 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:01Z","lastTransitionTime":"2025-10-01T10:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:01 crc kubenswrapper[4735]: E1001 10:18:01.125282 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:01Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:01 crc kubenswrapper[4735]: E1001 10:18:01.125434 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.127624 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.127673 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.127692 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.127712 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.127730 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:01Z","lastTransitionTime":"2025-10-01T10:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.230721 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.230788 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.230812 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.230840 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.230859 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:01Z","lastTransitionTime":"2025-10-01T10:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.334103 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.334160 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.334184 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.334210 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.334231 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:01Z","lastTransitionTime":"2025-10-01T10:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.437084 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.437128 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.437139 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.437156 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.437168 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:01Z","lastTransitionTime":"2025-10-01T10:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.539304 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.539346 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.539356 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.539370 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.539382 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:01Z","lastTransitionTime":"2025-10-01T10:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.641383 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.641419 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.641435 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.641449 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.641461 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:01Z","lastTransitionTime":"2025-10-01T10:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.744192 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.744581 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.744717 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.744841 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.744993 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:01Z","lastTransitionTime":"2025-10-01T10:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.847895 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.847968 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.848004 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.848031 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.848054 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:01Z","lastTransitionTime":"2025-10-01T10:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.909987 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:01Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.923967 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:01Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.934249 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b331af5f1d4f4b11c27d1c2285e760a27f9af30ff7fba24c2b73b63385f1e9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:01Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.944891 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:01Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.950041 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.950082 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:01 crc 
kubenswrapper[4735]: I1001 10:18:01.950091 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.950107 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.950116 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:01Z","lastTransitionTime":"2025-10-01T10:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.955270 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:01Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.965137 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b444968cd2dd7c27e3ab5de010228bdb6aca03388c8bf326ff5dc5b65d59a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a3
2554baabf84253c7bb4d630c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:01Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.979266 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7
f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:01Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:01 crc kubenswrapper[4735]: I1001 10:18:01.990847 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:01Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.005989 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98eb54784a97eb54cbe3bff224f952baab0bcf996cf032f3a81d336b8aa60d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f224
7750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:02Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.015858 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qm6mr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77b56a8b-1a27-4727-b45e-43fbc3847ddd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sffkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sffkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qm6mr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:02Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:02 crc 
kubenswrapper[4735]: I1001 10:18:02.027739 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:02Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.037760 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:02Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.048390 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73b8adee3754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:02Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.051807 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.051845 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.051856 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.051873 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.051885 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:02Z","lastTransitionTime":"2025-10-01T10:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.058914 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:02Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.084085 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metric
s-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-contro
ller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b1f5bbf3b5c3ce561b79914cbee1087f704ec5463ca49131e9975f04a5d285f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b1f5bbf3b5c3ce561b79914cbee1087f704ec5463ca49131e9975f04a5d285f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T10:17:53Z\\\",\\\"message\\\":\\\"hook for network=default\\\\nI1001 10:17:53.802247 6187 ovnkube.go:599] Stopped ovnkube\\\\nI1001 10:17:53.802243 6187 obj_retry.go:409] Going to retry *v1.Pod resource setup for 12 objects: [openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-ovn-kubernetes/ovnkube-node-k5mgz openshift-image-registry/node-ca-zdmp4 
openshift-machine-config-operator/machine-config-daemon-xgg24 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-multus/multus-additional-cni-plugins-6qlsd openshift-dns/node-resolver-5n9cx openshift-kube-apiserver/kube-apiserver-crc openshift-multus/multus-8dz9b openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/iptables-alerter-4ln5h openshift-network-diagnostics/network-check-source-55646444c4-trplf]\\\\nI1001 10:17:53.802329 6187 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1001 10:17:53.802112 6187 services_controller.go:445] Built service openshift-kube-apiserver/apiserver LB template configs for network=default: []services.lbConfig(nil)\\\\nF1001 10:17:53.802435 6187 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller ini\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-k5mgz_openshift-ovn-kubernetes(32f1531f-5034-48d4-b694-efc774226e37)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c
026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:02Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.097690 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9b48b18fd3ad59b9decc42e44539f17f32fb5d6c811e3d2beda91071fc76fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbe000aa1f2ef04cd49dd91e7ec18f7f04062
3e9cf2f86354c8ae6fbab575080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zcvzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:02Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.154158 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.154207 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.154220 4735 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.154238 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.154253 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:02Z","lastTransitionTime":"2025-10-01T10:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.256137 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.256168 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.256177 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.256189 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.256197 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:02Z","lastTransitionTime":"2025-10-01T10:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.358628 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.358660 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.358669 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.358681 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.358690 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:02Z","lastTransitionTime":"2025-10-01T10:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.461408 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.461474 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.461485 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.461532 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.461545 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:02Z","lastTransitionTime":"2025-10-01T10:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.563617 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.563669 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.563698 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.563712 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.563723 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:02Z","lastTransitionTime":"2025-10-01T10:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.666336 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.666402 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.666426 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.666455 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.666478 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:02Z","lastTransitionTime":"2025-10-01T10:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.768578 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.768627 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.768642 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.768657 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.768668 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:02Z","lastTransitionTime":"2025-10-01T10:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.871860 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.871929 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.871952 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.871981 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.872002 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:02Z","lastTransitionTime":"2025-10-01T10:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.896347 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.896377 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.896397 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:18:02 crc kubenswrapper[4735]: E1001 10:18:02.896461 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.896347 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:18:02 crc kubenswrapper[4735]: E1001 10:18:02.896561 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:18:02 crc kubenswrapper[4735]: E1001 10:18:02.896647 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:18:02 crc kubenswrapper[4735]: E1001 10:18:02.896715 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.974183 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.974217 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.974228 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.974244 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:02 crc kubenswrapper[4735]: I1001 10:18:02.974268 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:02Z","lastTransitionTime":"2025-10-01T10:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.067383 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77b56a8b-1a27-4727-b45e-43fbc3847ddd-metrics-certs\") pod \"network-metrics-daemon-qm6mr\" (UID: \"77b56a8b-1a27-4727-b45e-43fbc3847ddd\") " pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:18:03 crc kubenswrapper[4735]: E1001 10:18:03.067597 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 10:18:03 crc kubenswrapper[4735]: E1001 10:18:03.067694 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77b56a8b-1a27-4727-b45e-43fbc3847ddd-metrics-certs podName:77b56a8b-1a27-4727-b45e-43fbc3847ddd nodeName:}" failed. No retries permitted until 2025-10-01 10:18:11.067674834 +0000 UTC m=+49.760496096 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/77b56a8b-1a27-4727-b45e-43fbc3847ddd-metrics-certs") pod "network-metrics-daemon-qm6mr" (UID: "77b56a8b-1a27-4727-b45e-43fbc3847ddd") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.076313 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.076367 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.076381 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.076399 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.076410 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:03Z","lastTransitionTime":"2025-10-01T10:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.178975 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.179033 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.179043 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.179058 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.179067 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:03Z","lastTransitionTime":"2025-10-01T10:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.280939 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.281236 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.281247 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.281262 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.281271 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:03Z","lastTransitionTime":"2025-10-01T10:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.384128 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.384177 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.384189 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.384209 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.384221 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:03Z","lastTransitionTime":"2025-10-01T10:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.486364 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.486414 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.486426 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.486446 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.486457 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:03Z","lastTransitionTime":"2025-10-01T10:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.588330 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.588370 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.588378 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.588392 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.588403 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:03Z","lastTransitionTime":"2025-10-01T10:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.690770 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.690798 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.690808 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.690822 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.690833 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:03Z","lastTransitionTime":"2025-10-01T10:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.793519 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.793548 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.793557 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.793570 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.793578 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:03Z","lastTransitionTime":"2025-10-01T10:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.896120 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.896388 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.896474 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.896590 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.896717 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:03Z","lastTransitionTime":"2025-10-01T10:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.999528 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.999586 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.999602 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.999620 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:03 crc kubenswrapper[4735]: I1001 10:18:03.999634 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:03Z","lastTransitionTime":"2025-10-01T10:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.101926 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.101970 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.101980 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.102038 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.102058 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:04Z","lastTransitionTime":"2025-10-01T10:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.204775 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.204815 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.204824 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.204841 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.204851 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:04Z","lastTransitionTime":"2025-10-01T10:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.308262 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.308303 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.308320 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.308342 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.308369 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:04Z","lastTransitionTime":"2025-10-01T10:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.411075 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.411168 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.411450 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.411465 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.411474 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:04Z","lastTransitionTime":"2025-10-01T10:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.513610 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.513651 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.513660 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.513673 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.513683 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:04Z","lastTransitionTime":"2025-10-01T10:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.617911 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.617984 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.617997 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.618019 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.618046 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:04Z","lastTransitionTime":"2025-10-01T10:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.721220 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.721277 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.721290 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.721306 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.721318 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:04Z","lastTransitionTime":"2025-10-01T10:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.823796 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.823839 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.823847 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.823861 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.823874 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:04Z","lastTransitionTime":"2025-10-01T10:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.895906 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.895955 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:18:04 crc kubenswrapper[4735]: E1001 10:18:04.896021 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.896064 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.896102 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:18:04 crc kubenswrapper[4735]: E1001 10:18:04.896220 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:18:04 crc kubenswrapper[4735]: E1001 10:18:04.896335 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:18:04 crc kubenswrapper[4735]: E1001 10:18:04.896687 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.926760 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.926810 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.926822 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.926837 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:04 crc kubenswrapper[4735]: I1001 10:18:04.926855 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:04Z","lastTransitionTime":"2025-10-01T10:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.029937 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.029978 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.029987 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.030002 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.030012 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:05Z","lastTransitionTime":"2025-10-01T10:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.132717 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.132769 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.132779 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.132798 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.132809 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:05Z","lastTransitionTime":"2025-10-01T10:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.236877 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.236933 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.236944 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.236963 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.236974 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:05Z","lastTransitionTime":"2025-10-01T10:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.339566 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.339603 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.339615 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.339631 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.339643 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:05Z","lastTransitionTime":"2025-10-01T10:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.443140 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.443207 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.443222 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.443248 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.443268 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:05Z","lastTransitionTime":"2025-10-01T10:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.546539 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.546614 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.546632 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.546655 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.546677 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:05Z","lastTransitionTime":"2025-10-01T10:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.650218 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.650287 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.650301 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.650353 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.650370 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:05Z","lastTransitionTime":"2025-10-01T10:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.753330 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.753371 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.753382 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.753398 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.753409 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:05Z","lastTransitionTime":"2025-10-01T10:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.855985 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.856024 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.856036 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.856052 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.856064 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:05Z","lastTransitionTime":"2025-10-01T10:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.958710 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.958771 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.958783 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.958801 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:05 crc kubenswrapper[4735]: I1001 10:18:05.958812 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:05Z","lastTransitionTime":"2025-10-01T10:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.061176 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.061208 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.061217 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.061231 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.061239 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:06Z","lastTransitionTime":"2025-10-01T10:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.163272 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.163313 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.163323 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.163335 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.163344 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:06Z","lastTransitionTime":"2025-10-01T10:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.265523 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.265558 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.265566 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.265579 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.265588 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:06Z","lastTransitionTime":"2025-10-01T10:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.367287 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.367328 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.367341 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.367356 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.367369 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:06Z","lastTransitionTime":"2025-10-01T10:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.469469 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.469530 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.469541 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.469556 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.469569 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:06Z","lastTransitionTime":"2025-10-01T10:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.572245 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.572276 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.572283 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.572297 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.572306 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:06Z","lastTransitionTime":"2025-10-01T10:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.673962 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.674000 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.674009 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.674023 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.674030 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:06Z","lastTransitionTime":"2025-10-01T10:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.776126 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.776178 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.776189 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.776204 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.776213 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:06Z","lastTransitionTime":"2025-10-01T10:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.877914 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.877944 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.877951 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.877963 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.877971 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:06Z","lastTransitionTime":"2025-10-01T10:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.896610 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.896662 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.896685 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.896630 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:18:06 crc kubenswrapper[4735]: E1001 10:18:06.896746 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:18:06 crc kubenswrapper[4735]: E1001 10:18:06.896799 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:18:06 crc kubenswrapper[4735]: E1001 10:18:06.896910 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:18:06 crc kubenswrapper[4735]: E1001 10:18:06.897115 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.897699 4735 scope.go:117] "RemoveContainer" containerID="4b1f5bbf3b5c3ce561b79914cbee1087f704ec5463ca49131e9975f04a5d285f" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.980390 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.980426 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.980435 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.980448 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:06 crc kubenswrapper[4735]: I1001 10:18:06.980458 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:06Z","lastTransitionTime":"2025-10-01T10:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.082400 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.082437 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.082445 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.082459 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.082470 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:07Z","lastTransitionTime":"2025-10-01T10:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.160100 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k5mgz_32f1531f-5034-48d4-b694-efc774226e37/ovnkube-controller/1.log" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.164150 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" event={"ID":"32f1531f-5034-48d4-b694-efc774226e37","Type":"ContainerStarted","Data":"24dc0b6402fc5d62cac780c184d5adcb576e58a3b564daef937eef52531a7274"} Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.164536 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.177931 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:07Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.185262 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.185302 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.185313 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 
10:18:07.185328 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.185341 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:07Z","lastTransitionTime":"2025-10-01T10:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.190779 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:07Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.216543 4735 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:07Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.231056 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:07Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.252470 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b331af5f1d4f4b11c27d1c2285e760a27f9af30ff7fba24c2b73b63385f1e9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17
:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:07Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.270692 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7
f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:07Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.286437 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:07Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.287422 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.287449 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.287457 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:07 crc 
kubenswrapper[4735]: I1001 10:18:07.287471 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.287479 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:07Z","lastTransitionTime":"2025-10-01T10:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.298797 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b444968cd2dd7c27e3ab5de010228bdb6aca03388c8bf326ff5dc5b65d59a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a32554baabf84253c7bb4d630c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:07Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.313629 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:07Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.324622 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:07Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.339392 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98eb54784a97eb54cbe3bff224f952baab0bcf996cf032f3a81d336b8aa60d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f224
7750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:07Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.354283 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qm6mr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77b56a8b-1a27-4727-b45e-43fbc3847ddd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sffkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sffkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qm6mr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:07Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:07 crc 
kubenswrapper[4735]: I1001 10:18:07.365279 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9b48b18fd3ad59b9decc42e44539f17f32fb5d6c811e3d2beda91071fc76fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbe000aa1f2ef04cd49dd91e7ec18f7f040623e9cf2f86354c8ae6fbab575080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zcvzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:07Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.380345 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73b8adee3754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:07Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.390000 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.390063 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.390079 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.390105 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.390123 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:07Z","lastTransitionTime":"2025-10-01T10:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.394409 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:07Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.415319 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metric
s-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-contro
ller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24dc0b6402fc5d62cac780c184d5adcb576e58a3b564daef937eef52531a7274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b1f5bbf3b5c3ce561b79914cbee1087f704ec5463ca49131e9975f04a5d285f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T10:17:53Z\\\",\\\"message\\\":\\\"hook for network=default\\\\nI1001 10:17:53.802247 6187 ovnkube.go:599] Stopped ovnkube\\\\nI1001 10:17:53.802243 6187 obj_retry.go:409] Going to retry *v1.Pod resource setup for 12 objects: [openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-ovn-kubernetes/ovnkube-node-k5mgz openshift-image-registry/node-ca-zdmp4 
openshift-machine-config-operator/machine-config-daemon-xgg24 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-multus/multus-additional-cni-plugins-6qlsd openshift-dns/node-resolver-5n9cx openshift-kube-apiserver/kube-apiserver-crc openshift-multus/multus-8dz9b openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/iptables-alerter-4ln5h openshift-network-diagnostics/network-check-source-55646444c4-trplf]\\\\nI1001 10:17:53.802329 6187 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1001 10:17:53.802112 6187 services_controller.go:445] Built service openshift-kube-apiserver/apiserver LB template configs for network=default: []services.lbConfig(nil)\\\\nF1001 10:17:53.802435 6187 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller 
ini\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\
\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:07Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.493034 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.493096 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.493109 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.493127 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.493141 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:07Z","lastTransitionTime":"2025-10-01T10:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.596047 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.596084 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.596095 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.596111 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.596123 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:07Z","lastTransitionTime":"2025-10-01T10:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.698329 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.698373 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.698384 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.698398 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.698408 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:07Z","lastTransitionTime":"2025-10-01T10:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.801181 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.801228 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.801240 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.801253 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.801263 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:07Z","lastTransitionTime":"2025-10-01T10:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.903870 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.903934 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.903951 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.903978 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:07 crc kubenswrapper[4735]: I1001 10:18:07.903999 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:07Z","lastTransitionTime":"2025-10-01T10:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.005954 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.005997 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.006011 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.006027 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.006040 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:08Z","lastTransitionTime":"2025-10-01T10:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.107952 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.107984 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.107993 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.108005 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.108013 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:08Z","lastTransitionTime":"2025-10-01T10:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.169117 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k5mgz_32f1531f-5034-48d4-b694-efc774226e37/ovnkube-controller/2.log" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.169967 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k5mgz_32f1531f-5034-48d4-b694-efc774226e37/ovnkube-controller/1.log" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.173245 4735 generic.go:334] "Generic (PLEG): container finished" podID="32f1531f-5034-48d4-b694-efc774226e37" containerID="24dc0b6402fc5d62cac780c184d5adcb576e58a3b564daef937eef52531a7274" exitCode=1 Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.173297 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" event={"ID":"32f1531f-5034-48d4-b694-efc774226e37","Type":"ContainerDied","Data":"24dc0b6402fc5d62cac780c184d5adcb576e58a3b564daef937eef52531a7274"} Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.173341 4735 scope.go:117] "RemoveContainer" containerID="4b1f5bbf3b5c3ce561b79914cbee1087f704ec5463ca49131e9975f04a5d285f" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.174255 4735 scope.go:117] "RemoveContainer" containerID="24dc0b6402fc5d62cac780c184d5adcb576e58a3b564daef937eef52531a7274" Oct 01 10:18:08 crc kubenswrapper[4735]: E1001 10:18:08.174607 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-k5mgz_openshift-ovn-kubernetes(32f1531f-5034-48d4-b694-efc774226e37)\"" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" podUID="32f1531f-5034-48d4-b694-efc774226e37" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.187119 4735 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:08Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.204302 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:08Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.209791 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.209825 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.209837 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.209853 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.209862 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:08Z","lastTransitionTime":"2025-10-01T10:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.218170 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:08Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.231875 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:08Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.244524 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b331af5f1d4f4b11c27d1c2285e760a27f9af30ff7fba24c2b73b63385f1e9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:08Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.259520 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7
f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:08Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.273815 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:08Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.286865 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b444968cd2dd7c27e3ab5de010228bdb6aca03388c8bf326ff5dc5b65d59a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a3
2554baabf84253c7bb4d630c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:08Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.306133 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:08Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.311649 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.311696 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.311714 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.311738 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.311756 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:08Z","lastTransitionTime":"2025-10-01T10:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.326968 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:08Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.347464 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98eb54784a97eb54cbe3bff224f952baab0bcf996cf032f3a81d336b8aa60d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f224
7750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:08Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.361135 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qm6mr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77b56a8b-1a27-4727-b45e-43fbc3847ddd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sffkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sffkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qm6mr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:08Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:08 crc 
kubenswrapper[4735]: I1001 10:18:08.374255 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73b8adee3754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:08Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.384949 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:08Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.413846 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.413891 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.413901 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.413919 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.413931 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:08Z","lastTransitionTime":"2025-10-01T10:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.414718 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24dc0b6402fc5d62cac780c184d5adcb576e58a3b564daef937eef52531a7274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b1f5bbf3b5c3ce561b79914cbee1087f704ec5463ca49131e9975f04a5d285f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T10:17:53Z\\\",\\\"message\\\":\\\"hook for network=default\\\\nI1001 10:17:53.802247 6187 ovnkube.go:599] Stopped ovnkube\\\\nI1001 10:17:53.802243 6187 obj_retry.go:409] Going to retry *v1.Pod resource setup for 12 objects: [openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-ovn-kubernetes/ovnkube-node-k5mgz openshift-image-registry/node-ca-zdmp4 
openshift-machine-config-operator/machine-config-daemon-xgg24 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-multus/multus-additional-cni-plugins-6qlsd openshift-dns/node-resolver-5n9cx openshift-kube-apiserver/kube-apiserver-crc openshift-multus/multus-8dz9b openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/iptables-alerter-4ln5h openshift-network-diagnostics/network-check-source-55646444c4-trplf]\\\\nI1001 10:17:53.802329 6187 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1001 10:17:53.802112 6187 services_controller.go:445] Built service openshift-kube-apiserver/apiserver LB template configs for network=default: []services.lbConfig(nil)\\\\nF1001 10:17:53.802435 6187 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller ini\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24dc0b6402fc5d62cac780c184d5adcb576e58a3b564daef937eef52531a7274\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T10:18:07Z\\\",\\\"message\\\":\\\"kube-apiserver/kube-apiserver-crc openshift-kube-controller-manager/kube-controller-manager-crc openshift-multus/network-metrics-daemon-qm6mr]\\\\nF1001 10:18:07.648229 6415 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has 
stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:07Z is after 2025-08-24T17:21:41Z]\\\\nI1001 10:18:07.648215 6415 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-cluster-version/cluster-version-operator]} name:Service_openshift-cluster-version/cluster-version-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:08Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.425329 4735 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9b48b18fd3ad59b9decc42e44539f17f32fb5d6c811e3d2beda91071fc76fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbe000aa1f2ef04cd49dd91e7ec18f7f040623e9cf2f86354c8ae6fbab575080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zcvzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:08Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.516251 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.516288 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 
10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.516295 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.516309 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.516319 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:08Z","lastTransitionTime":"2025-10-01T10:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.618797 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.618841 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.618851 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.618865 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.618876 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:08Z","lastTransitionTime":"2025-10-01T10:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.721327 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.721376 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.721386 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.721399 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.721408 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:08Z","lastTransitionTime":"2025-10-01T10:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.823469 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.823516 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.823525 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.823539 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.823548 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:08Z","lastTransitionTime":"2025-10-01T10:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.896853 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.896896 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.896873 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.896853 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:18:08 crc kubenswrapper[4735]: E1001 10:18:08.897005 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:18:08 crc kubenswrapper[4735]: E1001 10:18:08.897107 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:18:08 crc kubenswrapper[4735]: E1001 10:18:08.897170 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:18:08 crc kubenswrapper[4735]: E1001 10:18:08.897213 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.926079 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.926103 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.926111 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.926122 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:08 crc kubenswrapper[4735]: I1001 10:18:08.926130 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:08Z","lastTransitionTime":"2025-10-01T10:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.028296 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.028339 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.028350 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.028378 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.028392 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:09Z","lastTransitionTime":"2025-10-01T10:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.130384 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.130424 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.130432 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.130448 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.130460 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:09Z","lastTransitionTime":"2025-10-01T10:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.177363 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k5mgz_32f1531f-5034-48d4-b694-efc774226e37/ovnkube-controller/2.log" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.180416 4735 scope.go:117] "RemoveContainer" containerID="24dc0b6402fc5d62cac780c184d5adcb576e58a3b564daef937eef52531a7274" Oct 01 10:18:09 crc kubenswrapper[4735]: E1001 10:18:09.180579 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-k5mgz_openshift-ovn-kubernetes(32f1531f-5034-48d4-b694-efc774226e37)\"" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" podUID="32f1531f-5034-48d4-b694-efc774226e37" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.195445 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7
f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:09Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.208347 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:09Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.219961 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b444968cd2dd7c27e3ab5de010228bdb6aca03388c8bf326ff5dc5b65d59a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a3
2554baabf84253c7bb4d630c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:09Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.232717 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.232751 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.232759 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:09 crc 
kubenswrapper[4735]: I1001 10:18:09.232773 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.232783 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:09Z","lastTransitionTime":"2025-10-01T10:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.236008 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:09Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.248021 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:09Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.264647 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98eb54784a97eb54cbe3bff224f952baab0bcf996cf032f3a81d336b8aa60d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f224
7750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:09Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.275721 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qm6mr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77b56a8b-1a27-4727-b45e-43fbc3847ddd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sffkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sffkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qm6mr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:09Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:09 crc 
kubenswrapper[4735]: I1001 10:18:09.287879 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:09Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.306346 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24dc0b6402fc5d62cac780c184d5adcb576e58a3b564daef937eef52531a7274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24dc0b6402fc5d62cac780c184d5adcb576e58a3b564daef937eef52531a7274\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T10:18:07Z\\\",\\\"message\\\":\\\"kube-apiserver/kube-apiserver-crc openshift-kube-controller-manager/kube-controller-manager-crc openshift-multus/network-metrics-daemon-qm6mr]\\\\nF1001 10:18:07.648229 6415 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: 
could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:07Z is after 2025-08-24T17:21:41Z]\\\\nI1001 10:18:07.648215 6415 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-cluster-version/cluster-version-operator]} name:Service_openshift-cluster-version/cluster-version-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:18:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-k5mgz_openshift-ovn-kubernetes(32f1531f-5034-48d4-b694-efc774226e37)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c
026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:09Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.320282 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9b48b18fd3ad59b9decc42e44539f17f32fb5d6c811e3d2beda91071fc76fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbe000aa1f2ef04cd49dd91e7ec18f7f04062
3e9cf2f86354c8ae6fbab575080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zcvzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:09Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.332965 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73b8adee3754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:09Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.334152 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.334178 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.334186 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.334198 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.334208 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:09Z","lastTransitionTime":"2025-10-01T10:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.343115 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b331af5f1d4f4b11c27d1c2285e760a27f9af30ff7fba24c2b73b63385f1e9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":
\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:09Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.355551 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:09Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.367238 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:09Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.380075 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:18:09Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.391488 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:18:09Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.436276 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.436314 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.436325 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.436340 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.436351 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:09Z","lastTransitionTime":"2025-10-01T10:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.538241 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.538269 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.538277 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.538290 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.538298 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:09Z","lastTransitionTime":"2025-10-01T10:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.640636 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.640678 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.640686 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.640700 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.640709 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:09Z","lastTransitionTime":"2025-10-01T10:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.742829 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.742864 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.742872 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.742886 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.742894 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:09Z","lastTransitionTime":"2025-10-01T10:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.845211 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.845239 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.845249 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.845263 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.845272 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:09Z","lastTransitionTime":"2025-10-01T10:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.947611 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.947650 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.947666 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.947681 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:09 crc kubenswrapper[4735]: I1001 10:18:09.947692 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:09Z","lastTransitionTime":"2025-10-01T10:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.050032 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.050084 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.050093 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.050105 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.050130 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:10Z","lastTransitionTime":"2025-10-01T10:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.152046 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.152080 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.152091 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.152108 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.152120 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:10Z","lastTransitionTime":"2025-10-01T10:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.254976 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.255018 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.255035 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.255053 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.255065 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:10Z","lastTransitionTime":"2025-10-01T10:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.357867 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.357912 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.357924 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.357938 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.357948 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:10Z","lastTransitionTime":"2025-10-01T10:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.461829 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.461871 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.461880 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.461897 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.461910 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:10Z","lastTransitionTime":"2025-10-01T10:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.564935 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.565018 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.565036 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.565066 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.565085 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:10Z","lastTransitionTime":"2025-10-01T10:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.667757 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.667857 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.667871 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.667892 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.667905 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:10Z","lastTransitionTime":"2025-10-01T10:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.771120 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.771161 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.771169 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.771183 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.771193 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:10Z","lastTransitionTime":"2025-10-01T10:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.873878 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.873945 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.873960 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.873984 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.874000 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:10Z","lastTransitionTime":"2025-10-01T10:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.896347 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:18:10 crc kubenswrapper[4735]: E1001 10:18:10.896509 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.896585 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.896636 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.896759 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:18:10 crc kubenswrapper[4735]: E1001 10:18:10.896867 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:18:10 crc kubenswrapper[4735]: E1001 10:18:10.897120 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:18:10 crc kubenswrapper[4735]: E1001 10:18:10.897347 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.977168 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.977215 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.977231 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.977248 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:10 crc kubenswrapper[4735]: I1001 10:18:10.977261 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:10Z","lastTransitionTime":"2025-10-01T10:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.080212 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.080262 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.080274 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.080291 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.080308 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:11Z","lastTransitionTime":"2025-10-01T10:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.154798 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77b56a8b-1a27-4727-b45e-43fbc3847ddd-metrics-certs\") pod \"network-metrics-daemon-qm6mr\" (UID: \"77b56a8b-1a27-4727-b45e-43fbc3847ddd\") " pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:18:11 crc kubenswrapper[4735]: E1001 10:18:11.155050 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 10:18:11 crc kubenswrapper[4735]: E1001 10:18:11.155149 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77b56a8b-1a27-4727-b45e-43fbc3847ddd-metrics-certs podName:77b56a8b-1a27-4727-b45e-43fbc3847ddd nodeName:}" failed. No retries permitted until 2025-10-01 10:18:27.155126204 +0000 UTC m=+65.847947486 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/77b56a8b-1a27-4727-b45e-43fbc3847ddd-metrics-certs") pod "network-metrics-daemon-qm6mr" (UID: "77b56a8b-1a27-4727-b45e-43fbc3847ddd") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.182555 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.182595 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.182604 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.182618 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.182628 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:11Z","lastTransitionTime":"2025-10-01T10:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.285599 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.285671 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.285686 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.285712 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.285728 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:11Z","lastTransitionTime":"2025-10-01T10:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.304892 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.304954 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.304972 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.304996 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.305014 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:11Z","lastTransitionTime":"2025-10-01T10:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:11 crc kubenswrapper[4735]: E1001 10:18:11.326051 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:11Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.330707 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.330741 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.330749 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.330765 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.330776 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:11Z","lastTransitionTime":"2025-10-01T10:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:11 crc kubenswrapper[4735]: E1001 10:18:11.350075 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:11Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.355970 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.356033 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.356057 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.356088 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.356110 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:11Z","lastTransitionTime":"2025-10-01T10:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:11 crc kubenswrapper[4735]: E1001 10:18:11.376309 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:11Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.381537 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.381602 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.381620 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.381786 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.381805 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:11Z","lastTransitionTime":"2025-10-01T10:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:11 crc kubenswrapper[4735]: E1001 10:18:11.398194 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:11Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.403805 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.403969 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.403995 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.404073 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.404103 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:11Z","lastTransitionTime":"2025-10-01T10:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:11 crc kubenswrapper[4735]: E1001 10:18:11.421054 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:11Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:11 crc kubenswrapper[4735]: E1001 10:18:11.421195 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.423028 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.423086 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.423104 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.423126 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.423139 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:11Z","lastTransitionTime":"2025-10-01T10:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.526642 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.526725 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.526746 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.526773 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.526795 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:11Z","lastTransitionTime":"2025-10-01T10:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.629388 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.629444 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.629460 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.629484 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.629519 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:11Z","lastTransitionTime":"2025-10-01T10:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.733171 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.733241 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.733260 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.733290 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.733308 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:11Z","lastTransitionTime":"2025-10-01T10:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.836240 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.836290 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.836299 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.836313 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.836323 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:11Z","lastTransitionTime":"2025-10-01T10:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.919996 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105
041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73b8adee3754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:11Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.939325 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.939390 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.939404 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.939424 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.939438 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:11Z","lastTransitionTime":"2025-10-01T10:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.941531 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:11Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.966417 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24dc0b6402fc5d62cac780c184d5adcb576e58a3b564daef937eef52531a7274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24dc0b6402fc5d62cac780c184d5adcb576e58a3b564daef937eef52531a7274\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T10:18:07Z\\\",\\\"message\\\":\\\"kube-apiserver/kube-apiserver-crc openshift-kube-controller-manager/kube-controller-manager-crc openshift-multus/network-metrics-daemon-qm6mr]\\\\nF1001 10:18:07.648229 6415 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: 
could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:07Z is after 2025-08-24T17:21:41Z]\\\\nI1001 10:18:07.648215 6415 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-cluster-version/cluster-version-operator]} name:Service_openshift-cluster-version/cluster-version-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:18:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-k5mgz_openshift-ovn-kubernetes(32f1531f-5034-48d4-b694-efc774226e37)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c
026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:11Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:11 crc kubenswrapper[4735]: I1001 10:18:11.989595 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9b48b18fd3ad59b9decc42e44539f17f32fb5d6c811e3d2beda91071fc76fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbe000aa1f2ef04cd49dd91e7ec18f7f04062
3e9cf2f86354c8ae6fbab575080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zcvzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:11Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.010132 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:12Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.031646 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:12Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.042717 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.042809 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.042833 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.042863 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.042884 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:12Z","lastTransitionTime":"2025-10-01T10:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.046764 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:12Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.065816 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:12Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.080077 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b331af5f1d4f4b11c27d1c2285e760a27f9af30ff7fba24c2b73b63385f1e9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:12Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.095045 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7
f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:12Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.135002 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:12Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.146008 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.146065 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.146079 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:12 crc 
kubenswrapper[4735]: I1001 10:18:12.146103 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.146118 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:12Z","lastTransitionTime":"2025-10-01T10:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.159447 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b444968cd2dd7c27e3ab5de010228bdb6aca03388c8bf326ff5dc5b65d59a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a32554baabf84253c7bb4d630c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:12Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.177100 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:12Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.191354 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:12Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.207394 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98eb54784a97eb54cbe3bff224f952baab0bcf996cf032f3a81d336b8aa60d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f224
7750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:12Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.218806 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qm6mr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77b56a8b-1a27-4727-b45e-43fbc3847ddd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sffkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sffkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qm6mr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:12Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:12 crc 
kubenswrapper[4735]: I1001 10:18:12.249082 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.249124 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.249136 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.249152 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.249164 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:12Z","lastTransitionTime":"2025-10-01T10:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.352819 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.352881 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.352894 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.352914 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.352929 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:12Z","lastTransitionTime":"2025-10-01T10:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.456017 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.456060 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.456081 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.456097 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.456108 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:12Z","lastTransitionTime":"2025-10-01T10:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.560200 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.560249 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.560258 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.560278 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.560290 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:12Z","lastTransitionTime":"2025-10-01T10:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.662898 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.662971 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.662990 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.663021 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.663041 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:12Z","lastTransitionTime":"2025-10-01T10:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.766162 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.766211 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.766223 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.766242 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.766257 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:12Z","lastTransitionTime":"2025-10-01T10:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.869646 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.869736 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.869805 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.869839 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.869863 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:12Z","lastTransitionTime":"2025-10-01T10:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.875125 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.875258 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.875302 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.875357 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.875393 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:18:12 crc kubenswrapper[4735]: E1001 10:18:12.875593 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 10:18:12 crc kubenswrapper[4735]: E1001 10:18:12.875595 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:18:44.875534753 +0000 UTC m=+83.568356085 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:18:12 crc kubenswrapper[4735]: E1001 10:18:12.875669 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 10:18:44.875645966 +0000 UTC m=+83.568467238 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 10:18:12 crc kubenswrapper[4735]: E1001 10:18:12.875668 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 10:18:12 crc kubenswrapper[4735]: E1001 10:18:12.875728 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 10:18:12 crc kubenswrapper[4735]: E1001 10:18:12.875756 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 10:18:12 crc kubenswrapper[4735]: E1001 10:18:12.875819 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 10:18:12 crc kubenswrapper[4735]: E1001 10:18:12.875895 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 10:18:44.875824501 +0000 UTC m=+83.568645923 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 10:18:12 crc kubenswrapper[4735]: E1001 10:18:12.875954 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 10:18:44.875929154 +0000 UTC m=+83.568750416 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 10:18:12 crc kubenswrapper[4735]: E1001 10:18:12.875954 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 10:18:12 crc kubenswrapper[4735]: E1001 10:18:12.875992 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 10:18:12 crc kubenswrapper[4735]: E1001 10:18:12.876015 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 10:18:12 crc kubenswrapper[4735]: 
E1001 10:18:12.876105 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 10:18:44.876083068 +0000 UTC m=+83.568904520 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.896152 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.896197 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.896226 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.896181 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:18:12 crc kubenswrapper[4735]: E1001 10:18:12.896460 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:18:12 crc kubenswrapper[4735]: E1001 10:18:12.896969 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:18:12 crc kubenswrapper[4735]: E1001 10:18:12.897043 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:18:12 crc kubenswrapper[4735]: E1001 10:18:12.897101 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.973107 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.973164 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.973179 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.973200 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:12 crc kubenswrapper[4735]: I1001 10:18:12.973214 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:12Z","lastTransitionTime":"2025-10-01T10:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.076117 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.076160 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.076172 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.076187 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.076198 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:13Z","lastTransitionTime":"2025-10-01T10:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.178546 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.178590 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.178601 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.178618 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.178627 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:13Z","lastTransitionTime":"2025-10-01T10:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.280273 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.280312 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.280321 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.280337 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.280347 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:13Z","lastTransitionTime":"2025-10-01T10:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.382258 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.382308 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.382321 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.382340 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.382353 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:13Z","lastTransitionTime":"2025-10-01T10:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.485422 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.485474 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.485488 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.485526 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.485541 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:13Z","lastTransitionTime":"2025-10-01T10:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.588467 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.588543 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.588557 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.588573 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.588584 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:13Z","lastTransitionTime":"2025-10-01T10:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.691875 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.691958 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.691983 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.692025 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.692053 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:13Z","lastTransitionTime":"2025-10-01T10:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.795620 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.795691 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.795710 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.795737 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.795756 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:13Z","lastTransitionTime":"2025-10-01T10:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.899844 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.899893 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.899906 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.899933 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:13 crc kubenswrapper[4735]: I1001 10:18:13.899949 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:13Z","lastTransitionTime":"2025-10-01T10:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.002379 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.002417 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.002427 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.002443 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.002453 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:14Z","lastTransitionTime":"2025-10-01T10:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.105563 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.105608 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.105618 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.105632 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.105677 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:14Z","lastTransitionTime":"2025-10-01T10:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.207725 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.207814 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.207824 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.207837 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.207847 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:14Z","lastTransitionTime":"2025-10-01T10:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.311103 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.311142 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.311152 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.311167 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.311177 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:14Z","lastTransitionTime":"2025-10-01T10:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.414383 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.414442 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.414453 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.414469 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.414481 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:14Z","lastTransitionTime":"2025-10-01T10:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.517120 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.517189 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.517210 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.517237 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.517255 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:14Z","lastTransitionTime":"2025-10-01T10:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.620567 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.620663 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.620681 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.620709 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.620729 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:14Z","lastTransitionTime":"2025-10-01T10:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.723566 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.723614 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.723626 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.723645 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.723658 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:14Z","lastTransitionTime":"2025-10-01T10:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.826733 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.826845 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.826862 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.826891 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.826907 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:14Z","lastTransitionTime":"2025-10-01T10:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.896370 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.896411 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.896380 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:18:14 crc kubenswrapper[4735]: E1001 10:18:14.896535 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.896684 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:18:14 crc kubenswrapper[4735]: E1001 10:18:14.896750 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:18:14 crc kubenswrapper[4735]: E1001 10:18:14.896990 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:18:14 crc kubenswrapper[4735]: E1001 10:18:14.897078 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.929010 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.929064 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.929077 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.929094 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:14 crc kubenswrapper[4735]: I1001 10:18:14.929110 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:14Z","lastTransitionTime":"2025-10-01T10:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.037372 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.037476 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.037519 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.037543 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.037559 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:15Z","lastTransitionTime":"2025-10-01T10:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.140960 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.141033 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.141044 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.141061 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.141076 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:15Z","lastTransitionTime":"2025-10-01T10:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.243845 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.243908 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.243916 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.243929 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.243937 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:15Z","lastTransitionTime":"2025-10-01T10:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.318801 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.332986 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.340544 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContaine
rStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:15Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.347354 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.347404 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.347418 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:15 crc 
kubenswrapper[4735]: I1001 10:18:15.347445 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.347466 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:15Z","lastTransitionTime":"2025-10-01T10:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.356381 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:15Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.373844 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b444968cd2dd7c27e3ab5de010228bdb6aca03388c8bf326ff5dc5b65d59a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a3
2554baabf84253c7bb4d630c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:15Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.389716 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qm6mr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77b56a8b-1a27-4727-b45e-43fbc3847ddd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sffkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sffkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qm6mr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:15Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:15 crc 
kubenswrapper[4735]: I1001 10:18:15.409155 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:15Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.424974 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:15Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.442008 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98eb54784a97eb54cbe3bff224f952baab0bcf996cf032f3a81d336b8aa60d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f224
7750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:15Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.451429 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.451483 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.451510 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.451694 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.451709 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:15Z","lastTransitionTime":"2025-10-01T10:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.456071 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:15Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.477023 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24dc0b6402fc5d62cac780c184d5adcb576e58a3b564daef937eef52531a7274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24dc0b6402fc5d62cac780c184d5adcb576e58a3b564daef937eef52531a7274\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T10:18:07Z\\\",\\\"message\\\":\\\"kube-apiserver/kube-apiserver-crc openshift-kube-controller-manager/kube-controller-manager-crc openshift-multus/network-metrics-daemon-qm6mr]\\\\nF1001 10:18:07.648229 6415 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: 
could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:07Z is after 2025-08-24T17:21:41Z]\\\\nI1001 10:18:07.648215 6415 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-cluster-version/cluster-version-operator]} name:Service_openshift-cluster-version/cluster-version-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:18:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-k5mgz_openshift-ovn-kubernetes(32f1531f-5034-48d4-b694-efc774226e37)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c
026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:15Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.489337 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9b48b18fd3ad59b9decc42e44539f17f32fb5d6c811e3d2beda91071fc76fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbe000aa1f2ef04cd49dd91e7ec18f7f04062
3e9cf2f86354c8ae6fbab575080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zcvzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:15Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.507241 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73b8adee3754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:15Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.522170 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:15Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.534047 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b331af5f1d4f4b11c27d1c2285e760a27f9af30ff7fba24c2b73b63385f1e9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:15Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.545394 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:15Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.554139 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.554190 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.554207 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.554227 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.554239 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:15Z","lastTransitionTime":"2025-10-01T10:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.558299 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:15Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.572605 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:18:15Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.656827 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.656916 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.656936 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.656961 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.656979 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:15Z","lastTransitionTime":"2025-10-01T10:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.760218 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.760325 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.760340 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.760362 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.760376 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:15Z","lastTransitionTime":"2025-10-01T10:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.863674 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.863780 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.863874 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.863903 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.863922 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:15Z","lastTransitionTime":"2025-10-01T10:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.968152 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.968258 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.968278 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.968305 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:15 crc kubenswrapper[4735]: I1001 10:18:15.968344 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:15Z","lastTransitionTime":"2025-10-01T10:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.071260 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.071316 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.071332 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.071356 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.071372 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:16Z","lastTransitionTime":"2025-10-01T10:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.173915 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.173961 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.173972 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.173994 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.174005 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:16Z","lastTransitionTime":"2025-10-01T10:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.276107 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.276151 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.276162 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.276176 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.276185 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:16Z","lastTransitionTime":"2025-10-01T10:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.378620 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.378710 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.378727 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.378752 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.378767 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:16Z","lastTransitionTime":"2025-10-01T10:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.481590 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.481637 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.481645 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.481695 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.481704 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:16Z","lastTransitionTime":"2025-10-01T10:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.584459 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.584522 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.584536 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.584552 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.584570 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:16Z","lastTransitionTime":"2025-10-01T10:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.686653 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.686693 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.686704 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.686720 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.686730 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:16Z","lastTransitionTime":"2025-10-01T10:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.789098 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.789152 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.789165 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.789180 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.789193 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:16Z","lastTransitionTime":"2025-10-01T10:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.890949 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.891013 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.891025 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.891037 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.891046 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:16Z","lastTransitionTime":"2025-10-01T10:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.896388 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.896448 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:18:16 crc kubenswrapper[4735]: E1001 10:18:16.896556 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.896567 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.896592 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:18:16 crc kubenswrapper[4735]: E1001 10:18:16.896639 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:18:16 crc kubenswrapper[4735]: E1001 10:18:16.896698 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:18:16 crc kubenswrapper[4735]: E1001 10:18:16.896781 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.993563 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.993598 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.993607 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.993621 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:16 crc kubenswrapper[4735]: I1001 10:18:16.993629 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:16Z","lastTransitionTime":"2025-10-01T10:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.095880 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.096524 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.096543 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.096562 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.096601 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:17Z","lastTransitionTime":"2025-10-01T10:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.199292 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.199337 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.199347 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.199362 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.199373 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:17Z","lastTransitionTime":"2025-10-01T10:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.302729 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.302789 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.302807 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.302833 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.302850 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:17Z","lastTransitionTime":"2025-10-01T10:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.406015 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.406071 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.406085 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.406104 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.406119 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:17Z","lastTransitionTime":"2025-10-01T10:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.509386 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.509417 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.509426 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.509441 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.509450 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:17Z","lastTransitionTime":"2025-10-01T10:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.612647 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.612698 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.612710 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.612727 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.612740 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:17Z","lastTransitionTime":"2025-10-01T10:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.717587 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.717665 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.717683 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.717713 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.717736 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:17Z","lastTransitionTime":"2025-10-01T10:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.821511 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.821557 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.821566 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.821581 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.821590 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:17Z","lastTransitionTime":"2025-10-01T10:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.923957 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.923998 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.924013 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.924028 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:17 crc kubenswrapper[4735]: I1001 10:18:17.924038 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:17Z","lastTransitionTime":"2025-10-01T10:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.027443 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.027583 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.027605 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.027994 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.028231 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:18Z","lastTransitionTime":"2025-10-01T10:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.132278 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.132372 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.132400 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.132432 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.132453 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:18Z","lastTransitionTime":"2025-10-01T10:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.235863 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.235936 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.235954 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.235982 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.236004 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:18Z","lastTransitionTime":"2025-10-01T10:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.339055 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.339133 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.339160 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.339196 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.339225 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:18Z","lastTransitionTime":"2025-10-01T10:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.442102 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.442173 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.442193 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.442223 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.442244 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:18Z","lastTransitionTime":"2025-10-01T10:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.545808 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.545874 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.545893 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.545920 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.545941 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:18Z","lastTransitionTime":"2025-10-01T10:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.650183 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.650309 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.650328 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.650358 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.650409 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:18Z","lastTransitionTime":"2025-10-01T10:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.753227 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.753286 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.753304 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.753329 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.753347 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:18Z","lastTransitionTime":"2025-10-01T10:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.856299 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.856354 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.856367 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.856390 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.856405 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:18Z","lastTransitionTime":"2025-10-01T10:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.896368 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.896424 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.896426 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.896482 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:18:18 crc kubenswrapper[4735]: E1001 10:18:18.896622 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:18:18 crc kubenswrapper[4735]: E1001 10:18:18.896797 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:18:18 crc kubenswrapper[4735]: E1001 10:18:18.897063 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:18:18 crc kubenswrapper[4735]: E1001 10:18:18.897129 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.959179 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.959215 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.959243 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.959260 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:18 crc kubenswrapper[4735]: I1001 10:18:18.959268 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:18Z","lastTransitionTime":"2025-10-01T10:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.062109 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.062162 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.062175 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.062195 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.062208 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:19Z","lastTransitionTime":"2025-10-01T10:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.165020 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.165077 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.165091 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.165111 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.165124 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:19Z","lastTransitionTime":"2025-10-01T10:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.267431 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.267487 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.267526 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.267548 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.267562 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:19Z","lastTransitionTime":"2025-10-01T10:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.371021 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.371110 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.371136 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.371174 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.371202 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:19Z","lastTransitionTime":"2025-10-01T10:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.474932 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.475557 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.475726 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.475897 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.476108 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:19Z","lastTransitionTime":"2025-10-01T10:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.579022 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.579104 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.579122 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.579155 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.579177 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:19Z","lastTransitionTime":"2025-10-01T10:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.682457 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.682568 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.682588 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.682615 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.682634 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:19Z","lastTransitionTime":"2025-10-01T10:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.786028 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.786136 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.786163 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.786202 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.786227 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:19Z","lastTransitionTime":"2025-10-01T10:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.889360 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.889423 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.889441 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.889464 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.889482 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:19Z","lastTransitionTime":"2025-10-01T10:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.991941 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.991988 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.992003 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.992023 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:19 crc kubenswrapper[4735]: I1001 10:18:19.992038 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:19Z","lastTransitionTime":"2025-10-01T10:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.094724 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.094787 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.094810 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.094827 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.094838 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:20Z","lastTransitionTime":"2025-10-01T10:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.198267 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.198373 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.198395 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.198426 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.198445 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:20Z","lastTransitionTime":"2025-10-01T10:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.301748 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.301826 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.301846 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.301878 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.301900 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:20Z","lastTransitionTime":"2025-10-01T10:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.404356 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.404398 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.404409 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.404425 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.404435 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:20Z","lastTransitionTime":"2025-10-01T10:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.506544 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.506576 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.506585 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.506597 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.506605 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:20Z","lastTransitionTime":"2025-10-01T10:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.608570 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.608599 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.608607 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.608618 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.608627 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:20Z","lastTransitionTime":"2025-10-01T10:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.711838 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.711889 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.711903 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.711929 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.711950 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:20Z","lastTransitionTime":"2025-10-01T10:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.814605 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.814667 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.814683 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.814712 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.814731 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:20Z","lastTransitionTime":"2025-10-01T10:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.896846 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.896866 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.896984 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.896984 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:18:20 crc kubenswrapper[4735]: E1001 10:18:20.897173 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:18:20 crc kubenswrapper[4735]: E1001 10:18:20.897272 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:18:20 crc kubenswrapper[4735]: E1001 10:18:20.897316 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:18:20 crc kubenswrapper[4735]: E1001 10:18:20.897417 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.917977 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.918067 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.918092 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.918128 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:20 crc kubenswrapper[4735]: I1001 10:18:20.918150 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:20Z","lastTransitionTime":"2025-10-01T10:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.020441 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.020531 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.020555 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.020582 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.020602 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:21Z","lastTransitionTime":"2025-10-01T10:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.123810 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.123886 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.123904 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.123935 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.123986 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:21Z","lastTransitionTime":"2025-10-01T10:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.227347 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.227409 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.227422 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.227447 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.227463 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:21Z","lastTransitionTime":"2025-10-01T10:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.331004 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.331069 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.331081 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.331103 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.331120 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:21Z","lastTransitionTime":"2025-10-01T10:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.434835 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.434898 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.434914 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.434940 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.434959 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:21Z","lastTransitionTime":"2025-10-01T10:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.538676 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.538723 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.538731 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.538749 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.538760 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:21Z","lastTransitionTime":"2025-10-01T10:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.601211 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.601281 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.601302 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.601327 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.601344 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:21Z","lastTransitionTime":"2025-10-01T10:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:21 crc kubenswrapper[4735]: E1001 10:18:21.618791 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:21Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.625412 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.625474 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.625505 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.625526 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.625544 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:21Z","lastTransitionTime":"2025-10-01T10:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:21 crc kubenswrapper[4735]: E1001 10:18:21.641712 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:21Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.646004 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.646043 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.646053 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.646068 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.646080 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:21Z","lastTransitionTime":"2025-10-01T10:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:21 crc kubenswrapper[4735]: E1001 10:18:21.657478 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:21Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.661624 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.661654 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.661663 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.661676 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.661687 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:21Z","lastTransitionTime":"2025-10-01T10:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:21 crc kubenswrapper[4735]: E1001 10:18:21.674841 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:21Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.678913 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.678953 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.678962 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.678977 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.678987 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:21Z","lastTransitionTime":"2025-10-01T10:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:21 crc kubenswrapper[4735]: E1001 10:18:21.690090 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:21Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:21 crc kubenswrapper[4735]: E1001 10:18:21.690262 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.692327 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.692382 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.692398 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.692424 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.692440 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:21Z","lastTransitionTime":"2025-10-01T10:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.794855 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.794908 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.794917 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.794931 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.794957 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:21Z","lastTransitionTime":"2025-10-01T10:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.897550 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.898631 4735 scope.go:117] "RemoveContainer" containerID="24dc0b6402fc5d62cac780c184d5adcb576e58a3b564daef937eef52531a7274" Oct 01 10:18:21 crc kubenswrapper[4735]: E1001 10:18:21.899479 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-k5mgz_openshift-ovn-kubernetes(32f1531f-5034-48d4-b694-efc774226e37)\"" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" podUID="32f1531f-5034-48d4-b694-efc774226e37" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.899695 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.899723 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.900729 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.900950 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:21Z","lastTransitionTime":"2025-10-01T10:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.922304 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9b48b18fd3ad59b9decc42e44539f17f32fb5d6c811e3d2beda91071fc76fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbe000aa1f2ef04cd49dd91e7ec18f7f040623e9cf2f86354c8ae6fbab575080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zcvzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:21Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.939346 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73b8adee3754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:21Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.956720 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2be33635-a388-4c92-a1e6-9a0736dbbe79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07aa8a6d6d8611abb4a1531fc38c8bbdca73cf532d2f45645fcfb0eaeb92fec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e93060fbaf012ed4f8aec1624caf86aa986cea91a7ba97b7fb1dffa87343092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accb544db17576f62271088d4e35c414c0ed291197cf7d953466d7634edff0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb0cece7c30c3fbd18adeb8daa644b951b45a159308fa26eec931f95f0ac881\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7cb0cece7c30c3fbd18adeb8daa644b951b45a159308fa26eec931f95f0ac881\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:21Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:21 crc kubenswrapper[4735]: I1001 10:18:21.971436 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff408481
5fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:21Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.002568 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24dc0b6402fc5d62cac780c184d5adcb576e58a3b564daef937eef52531a7274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24dc0b6402fc5d62cac780c184d5adcb576e58a3b564daef937eef52531a7274\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T10:18:07Z\\\",\\\"message\\\":\\\"kube-apiserver/kube-apiserver-crc openshift-kube-controller-manager/kube-controller-manager-crc openshift-multus/network-metrics-daemon-qm6mr]\\\\nF1001 10:18:07.648229 6415 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: 
could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:07Z is after 2025-08-24T17:21:41Z]\\\\nI1001 10:18:07.648215 6415 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-cluster-version/cluster-version-operator]} name:Service_openshift-cluster-version/cluster-version-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:18:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-k5mgz_openshift-ovn-kubernetes(32f1531f-5034-48d4-b694-efc774226e37)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c
026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:21Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.004000 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.004070 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.004090 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.004118 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.004141 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:22Z","lastTransitionTime":"2025-10-01T10:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.021119 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:22Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.037887 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:22Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.051090 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:18:22Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.065714 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:18:22Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.077033 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b331af5f1d4f4b11c27d1c2285e760a27f9af30ff7fba24c2b73b63385f1e9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:22Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.089330 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7
f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:22Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.102847 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:22Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.107072 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.107113 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.107130 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:22 crc 
kubenswrapper[4735]: I1001 10:18:22.107151 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.107164 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:22Z","lastTransitionTime":"2025-10-01T10:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.115444 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b444968cd2dd7c27e3ab5de010228bdb6aca03388c8bf326ff5dc5b65d59a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a32554baabf84253c7bb4d630c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:22Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.133466 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:22Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.148623 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:22Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.170313 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98eb54784a97eb54cbe3bff224f952baab0bcf996cf032f3a81d336b8aa60d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f224
7750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:22Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.187164 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qm6mr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77b56a8b-1a27-4727-b45e-43fbc3847ddd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sffkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sffkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qm6mr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:22Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:22 crc 
kubenswrapper[4735]: I1001 10:18:22.209936 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.209989 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.210003 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.210022 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.210038 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:22Z","lastTransitionTime":"2025-10-01T10:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.312619 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.312713 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.312730 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.312761 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.312775 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:22Z","lastTransitionTime":"2025-10-01T10:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.419534 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.419688 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.419710 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.419737 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.419754 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:22Z","lastTransitionTime":"2025-10-01T10:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.523416 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.523470 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.523480 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.523517 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.523532 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:22Z","lastTransitionTime":"2025-10-01T10:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.626581 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.626632 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.626641 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.626664 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.626677 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:22Z","lastTransitionTime":"2025-10-01T10:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.731241 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.731315 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.731333 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.731364 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.731382 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:22Z","lastTransitionTime":"2025-10-01T10:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.834899 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.834980 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.835000 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.835032 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.835051 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:22Z","lastTransitionTime":"2025-10-01T10:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.896838 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.896853 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:18:22 crc kubenswrapper[4735]: E1001 10:18:22.897127 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.896878 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:18:22 crc kubenswrapper[4735]: E1001 10:18:22.897300 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.896893 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:18:22 crc kubenswrapper[4735]: E1001 10:18:22.897413 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:18:22 crc kubenswrapper[4735]: E1001 10:18:22.897563 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.938255 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.938318 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.938338 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.938365 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:22 crc kubenswrapper[4735]: I1001 10:18:22.938384 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:22Z","lastTransitionTime":"2025-10-01T10:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.042374 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.042441 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.042459 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.042524 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.042545 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:23Z","lastTransitionTime":"2025-10-01T10:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.144850 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.144889 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.144898 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.144913 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.144923 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:23Z","lastTransitionTime":"2025-10-01T10:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.247257 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.247293 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.247301 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.247314 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.247323 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:23Z","lastTransitionTime":"2025-10-01T10:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.350476 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.350584 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.350608 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.350677 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.350735 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:23Z","lastTransitionTime":"2025-10-01T10:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.453449 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.453533 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.453545 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.453558 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.453568 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:23Z","lastTransitionTime":"2025-10-01T10:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.556910 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.556959 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.556970 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.556990 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.557003 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:23Z","lastTransitionTime":"2025-10-01T10:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.660151 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.660194 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.660202 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.660218 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.660227 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:23Z","lastTransitionTime":"2025-10-01T10:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.763292 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.763331 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.763342 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.763358 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.763370 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:23Z","lastTransitionTime":"2025-10-01T10:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.866624 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.866667 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.866680 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.866698 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.866712 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:23Z","lastTransitionTime":"2025-10-01T10:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.969386 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.969446 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.969455 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.969469 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:23 crc kubenswrapper[4735]: I1001 10:18:23.969479 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:23Z","lastTransitionTime":"2025-10-01T10:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.072122 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.072162 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.072174 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.072187 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.072196 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:24Z","lastTransitionTime":"2025-10-01T10:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.174479 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.174539 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.174553 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.174571 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.174584 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:24Z","lastTransitionTime":"2025-10-01T10:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.276920 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.276986 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.277011 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.277038 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.277055 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:24Z","lastTransitionTime":"2025-10-01T10:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.379367 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.379396 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.379403 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.379415 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.379423 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:24Z","lastTransitionTime":"2025-10-01T10:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.482176 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.482213 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.482221 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.482234 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.482244 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:24Z","lastTransitionTime":"2025-10-01T10:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.585073 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.585126 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.585146 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.585169 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.585186 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:24Z","lastTransitionTime":"2025-10-01T10:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.688139 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.688214 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.688249 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.688267 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.688285 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:24Z","lastTransitionTime":"2025-10-01T10:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.791523 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.791580 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.791592 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.791609 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.791622 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:24Z","lastTransitionTime":"2025-10-01T10:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.894423 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.894483 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.894516 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.894536 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.894551 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:24Z","lastTransitionTime":"2025-10-01T10:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.896630 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.896664 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.896688 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.896690 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:18:24 crc kubenswrapper[4735]: E1001 10:18:24.896734 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:18:24 crc kubenswrapper[4735]: E1001 10:18:24.896872 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:18:24 crc kubenswrapper[4735]: E1001 10:18:24.896983 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:18:24 crc kubenswrapper[4735]: E1001 10:18:24.897050 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.998165 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.998216 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.998227 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.998243 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:24 crc kubenswrapper[4735]: I1001 10:18:24.998254 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:24Z","lastTransitionTime":"2025-10-01T10:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.100317 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.100367 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.100379 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.100397 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.100413 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:25Z","lastTransitionTime":"2025-10-01T10:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.203159 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.203278 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.203290 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.203312 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.203329 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:25Z","lastTransitionTime":"2025-10-01T10:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.306153 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.306182 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.306190 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.306202 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.306212 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:25Z","lastTransitionTime":"2025-10-01T10:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.408760 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.408799 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.408810 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.408826 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.408836 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:25Z","lastTransitionTime":"2025-10-01T10:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.511022 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.511055 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.511080 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.511092 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.511101 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:25Z","lastTransitionTime":"2025-10-01T10:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.614331 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.614380 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.614390 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.614404 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.614414 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:25Z","lastTransitionTime":"2025-10-01T10:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.718038 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.718100 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.718113 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.718135 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.718150 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:25Z","lastTransitionTime":"2025-10-01T10:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.820725 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.820772 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.820783 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.820799 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.820813 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:25Z","lastTransitionTime":"2025-10-01T10:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.923726 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.923770 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.923778 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.923793 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:25 crc kubenswrapper[4735]: I1001 10:18:25.923802 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:25Z","lastTransitionTime":"2025-10-01T10:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.026620 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.026664 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.026675 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.026690 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.026699 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:26Z","lastTransitionTime":"2025-10-01T10:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.129136 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.129197 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.129210 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.129228 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.129239 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:26Z","lastTransitionTime":"2025-10-01T10:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.232354 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.232413 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.232427 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.232449 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.232465 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:26Z","lastTransitionTime":"2025-10-01T10:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.335127 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.335165 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.335176 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.335192 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.335203 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:26Z","lastTransitionTime":"2025-10-01T10:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.437595 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.437630 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.437637 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.437650 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.437659 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:26Z","lastTransitionTime":"2025-10-01T10:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.540868 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.541008 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.541032 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.541093 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.541114 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:26Z","lastTransitionTime":"2025-10-01T10:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.644531 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.644590 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.644603 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.644625 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.644638 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:26Z","lastTransitionTime":"2025-10-01T10:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.747370 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.747550 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.747575 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.747650 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.747681 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:26Z","lastTransitionTime":"2025-10-01T10:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.851070 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.851154 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.851166 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.851189 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.851201 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:26Z","lastTransitionTime":"2025-10-01T10:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.896070 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.896106 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.896150 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:18:26 crc kubenswrapper[4735]: E1001 10:18:26.896222 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.896272 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:18:26 crc kubenswrapper[4735]: E1001 10:18:26.896388 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:18:26 crc kubenswrapper[4735]: E1001 10:18:26.896426 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:18:26 crc kubenswrapper[4735]: E1001 10:18:26.896483 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.953770 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.953833 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.953844 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.953860 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:26 crc kubenswrapper[4735]: I1001 10:18:26.953870 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:26Z","lastTransitionTime":"2025-10-01T10:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.055984 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.056601 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.056632 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.056652 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.056669 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:27Z","lastTransitionTime":"2025-10-01T10:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.159242 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.159282 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.159292 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.159308 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.159317 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:27Z","lastTransitionTime":"2025-10-01T10:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.235449 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77b56a8b-1a27-4727-b45e-43fbc3847ddd-metrics-certs\") pod \"network-metrics-daemon-qm6mr\" (UID: \"77b56a8b-1a27-4727-b45e-43fbc3847ddd\") " pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:18:27 crc kubenswrapper[4735]: E1001 10:18:27.235637 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 10:18:27 crc kubenswrapper[4735]: E1001 10:18:27.235730 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77b56a8b-1a27-4727-b45e-43fbc3847ddd-metrics-certs podName:77b56a8b-1a27-4727-b45e-43fbc3847ddd nodeName:}" failed. No retries permitted until 2025-10-01 10:18:59.235711839 +0000 UTC m=+97.928533101 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/77b56a8b-1a27-4727-b45e-43fbc3847ddd-metrics-certs") pod "network-metrics-daemon-qm6mr" (UID: "77b56a8b-1a27-4727-b45e-43fbc3847ddd") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.261552 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.261603 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.261615 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.261635 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.261647 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:27Z","lastTransitionTime":"2025-10-01T10:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.364593 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.364663 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.364677 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.364698 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.365020 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:27Z","lastTransitionTime":"2025-10-01T10:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.467654 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.467696 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.467707 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.467725 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.467734 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:27Z","lastTransitionTime":"2025-10-01T10:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.570143 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.570185 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.570193 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.570210 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.570219 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:27Z","lastTransitionTime":"2025-10-01T10:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.672873 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.672920 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.672935 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.672952 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.672964 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:27Z","lastTransitionTime":"2025-10-01T10:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.775551 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.775582 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.775592 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.775605 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.775615 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:27Z","lastTransitionTime":"2025-10-01T10:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.879098 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.879138 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.879147 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.879166 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.879177 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:27Z","lastTransitionTime":"2025-10-01T10:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.981815 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.981883 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.981903 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.981932 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:27 crc kubenswrapper[4735]: I1001 10:18:27.981956 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:27Z","lastTransitionTime":"2025-10-01T10:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.084466 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.084526 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.084535 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.084549 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.084559 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:28Z","lastTransitionTime":"2025-10-01T10:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.186640 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.186684 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.186698 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.186735 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.186751 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:28Z","lastTransitionTime":"2025-10-01T10:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.289036 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.289095 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.289105 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.289119 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.289131 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:28Z","lastTransitionTime":"2025-10-01T10:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.390797 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.390834 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.390844 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.390858 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.390868 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:28Z","lastTransitionTime":"2025-10-01T10:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.493338 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.493379 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.493419 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.493438 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.493448 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:28Z","lastTransitionTime":"2025-10-01T10:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.596288 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.596318 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.596326 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.596337 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.596345 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:28Z","lastTransitionTime":"2025-10-01T10:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.698793 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.698845 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.698857 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.698874 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.698885 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:28Z","lastTransitionTime":"2025-10-01T10:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.801278 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.801311 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.801319 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.801334 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.801342 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:28Z","lastTransitionTime":"2025-10-01T10:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.896682 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:18:28 crc kubenswrapper[4735]: E1001 10:18:28.896806 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.896860 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:18:28 crc kubenswrapper[4735]: E1001 10:18:28.897041 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.897129 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.897163 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:18:28 crc kubenswrapper[4735]: E1001 10:18:28.897223 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:18:28 crc kubenswrapper[4735]: E1001 10:18:28.897276 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.902982 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.903010 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.903019 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.903031 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:28 crc kubenswrapper[4735]: I1001 10:18:28.903040 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:28Z","lastTransitionTime":"2025-10-01T10:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.005827 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.005856 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.005865 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.005879 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.005888 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:29Z","lastTransitionTime":"2025-10-01T10:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.107335 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.107378 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.107390 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.107407 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.107420 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:29Z","lastTransitionTime":"2025-10-01T10:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.210141 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.210181 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.210193 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.210207 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.210216 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:29Z","lastTransitionTime":"2025-10-01T10:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.312164 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.312226 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.312236 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.312251 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.312261 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:29Z","lastTransitionTime":"2025-10-01T10:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.414271 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.414324 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.414333 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.414345 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.414353 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:29Z","lastTransitionTime":"2025-10-01T10:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.516532 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.516582 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.516593 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.516609 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.516619 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:29Z","lastTransitionTime":"2025-10-01T10:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.618933 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.618969 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.618977 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.618989 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.618996 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:29Z","lastTransitionTime":"2025-10-01T10:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.721222 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.721266 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.721274 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.721323 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.721333 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:29Z","lastTransitionTime":"2025-10-01T10:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.823785 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.824041 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.824136 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.824256 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.824329 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:29Z","lastTransitionTime":"2025-10-01T10:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.926432 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.926474 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.926483 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.926520 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:29 crc kubenswrapper[4735]: I1001 10:18:29.926530 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:29Z","lastTransitionTime":"2025-10-01T10:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.028614 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.028650 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.028658 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.028672 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.028683 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:30Z","lastTransitionTime":"2025-10-01T10:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.131375 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.132015 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.132085 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.132167 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.132235 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:30Z","lastTransitionTime":"2025-10-01T10:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.235192 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.235229 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.235240 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.235255 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.235265 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:30Z","lastTransitionTime":"2025-10-01T10:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.337733 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.337787 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.337801 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.337825 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.337841 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:30Z","lastTransitionTime":"2025-10-01T10:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.440033 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.440328 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.440451 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.440616 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.440735 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:30Z","lastTransitionTime":"2025-10-01T10:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.543466 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.543519 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.543531 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.543545 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.543554 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:30Z","lastTransitionTime":"2025-10-01T10:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.646788 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.646860 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.646875 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.646891 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.646905 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:30Z","lastTransitionTime":"2025-10-01T10:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.749034 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.749070 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.749078 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.749092 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.749100 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:30Z","lastTransitionTime":"2025-10-01T10:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.851699 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.851738 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.851750 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.851766 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.851778 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:30Z","lastTransitionTime":"2025-10-01T10:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.896647 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.896701 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.896656 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:18:30 crc kubenswrapper[4735]: E1001 10:18:30.896767 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:18:30 crc kubenswrapper[4735]: E1001 10:18:30.896886 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:18:30 crc kubenswrapper[4735]: E1001 10:18:30.896967 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.897146 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:18:30 crc kubenswrapper[4735]: E1001 10:18:30.897334 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.954254 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.954290 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.954303 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.954317 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:30 crc kubenswrapper[4735]: I1001 10:18:30.954328 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:30Z","lastTransitionTime":"2025-10-01T10:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.056658 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.056703 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.056715 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.056733 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.056744 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:31Z","lastTransitionTime":"2025-10-01T10:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.159203 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.159243 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.159253 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.159267 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.159277 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:31Z","lastTransitionTime":"2025-10-01T10:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.254334 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8dz9b_5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7/kube-multus/0.log" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.254381 4735 generic.go:334] "Generic (PLEG): container finished" podID="5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7" containerID="cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23" exitCode=1 Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.254411 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8dz9b" event={"ID":"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7","Type":"ContainerDied","Data":"cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23"} Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.254866 4735 scope.go:117] "RemoveContainer" containerID="cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.260668 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.260713 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.260727 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.260751 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.260772 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:31Z","lastTransitionTime":"2025-10-01T10:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.268879 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73b8adee3754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b8
2799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:31Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.287687 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2be33635-a388-4c92-a1e6-9a0736dbbe79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07aa8a6d6d8611abb4a1531fc38c8bbdca73cf532d2f45645fcfb0eaeb92fec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e93060fbaf012ed4f8aec1624caf86aa986cea91a7ba97b7fb1dffa87343092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accb544db17576f62271088d4e35c414c0ed291197cf7d953466d7634edff0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb0cece7c30c3fbd18adeb8daa644b951b45a159308fa26eec931f95f0ac881\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7cb0cece7c30c3fbd18adeb8daa644b951b45a159308fa26eec931f95f0ac881\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:31Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.297523 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff408481
5fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:31Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.317251 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24dc0b6402fc5d62cac780c184d5adcb576e58a3b564daef937eef52531a7274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24dc0b6402fc5d62cac780c184d5adcb576e58a3b564daef937eef52531a7274\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T10:18:07Z\\\",\\\"message\\\":\\\"kube-apiserver/kube-apiserver-crc openshift-kube-controller-manager/kube-controller-manager-crc openshift-multus/network-metrics-daemon-qm6mr]\\\\nF1001 10:18:07.648229 6415 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: 
could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:07Z is after 2025-08-24T17:21:41Z]\\\\nI1001 10:18:07.648215 6415 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-cluster-version/cluster-version-operator]} name:Service_openshift-cluster-version/cluster-version-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:18:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-k5mgz_openshift-ovn-kubernetes(32f1531f-5034-48d4-b694-efc774226e37)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c
026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:31Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.327240 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9b48b18fd3ad59b9decc42e44539f17f32fb5d6c811e3d2beda91071fc76fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbe000aa1f2ef04cd49dd91e7ec18f7f04062
3e9cf2f86354c8ae6fbab575080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zcvzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:31Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.338465 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:31Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.351934 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:31Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.363324 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:18:31Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.365351 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.365389 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.365403 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.365426 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.365438 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:31Z","lastTransitionTime":"2025-10-01T10:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.376720 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T10:18:30Z\\\",\\\"message\\\":\\\"2025-10-01T10:17:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4e070c81-c0cb-4c53-bad9-c116c63efe93\\\\n2025-10-01T10:17:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4e070c81-c0cb-4c53-bad9-c116c63efe93 to /host/opt/cni/bin/\\\\n2025-10-01T10:17:45Z [verbose] multus-daemon started\\\\n2025-10-01T10:17:45Z [verbose] Readiness Indicator file check\\\\n2025-10-01T10:18:30Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:31Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.385575 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b331af5f1d4f4b11c27d1c2285e760a27f9af30ff7fba24c2b73b63385f1e9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:31Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.397266 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7
f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:31Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.408401 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:31Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.417524 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b444968cd2dd7c27e3ab5de010228bdb6aca03388c8bf326ff5dc5b65d59a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a3
2554baabf84253c7bb4d630c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:31Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.428316 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:31Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.439757 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:31Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.452883 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98eb54784a97eb54cbe3bff224f952baab0bcf996cf032f3a81d336b8aa60d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f224
7750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:31Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.464159 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qm6mr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77b56a8b-1a27-4727-b45e-43fbc3847ddd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sffkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sffkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qm6mr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:31Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:31 crc 
kubenswrapper[4735]: I1001 10:18:31.467911 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.467959 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.467972 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.467989 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.468001 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:31Z","lastTransitionTime":"2025-10-01T10:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.570734 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.570767 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.570776 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.570789 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.570800 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:31Z","lastTransitionTime":"2025-10-01T10:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.673522 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.673550 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.673559 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.673571 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.673581 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:31Z","lastTransitionTime":"2025-10-01T10:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.775964 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.775996 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.776041 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.776056 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.776066 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:31Z","lastTransitionTime":"2025-10-01T10:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.878477 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.878524 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.878552 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.878566 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.878575 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:31Z","lastTransitionTime":"2025-10-01T10:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.908324 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:31Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.920892 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:31Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.931323 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:18:31Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.943611 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T10:18:30Z\\\",\\\"message\\\":\\\"2025-10-01T10:17:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4e070c81-c0cb-4c53-bad9-c116c63efe93\\\\n2025-10-01T10:17:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4e070c81-c0cb-4c53-bad9-c116c63efe93 to /host/opt/cni/bin/\\\\n2025-10-01T10:17:45Z [verbose] multus-daemon started\\\\n2025-10-01T10:17:45Z [verbose] Readiness Indicator file check\\\\n2025-10-01T10:18:30Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:31Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.953674 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b331af5f1d4f4b11c27d1c2285e760a27f9af30ff7fba24c2b73b63385f1e9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:31Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.967627 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7
f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:31Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.980190 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:31Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.980372 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.980470 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.980483 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:31 crc 
kubenswrapper[4735]: I1001 10:18:31.980518 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.980533 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:31Z","lastTransitionTime":"2025-10-01T10:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:31 crc kubenswrapper[4735]: I1001 10:18:31.995289 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b444968cd2dd7c27e3ab5de010228bdb6aca03388c8bf326ff5dc5b65d59a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a32554baabf84253c7bb4d630c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:31Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.009082 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:32Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.020528 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:32Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.035849 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98eb54784a97eb54cbe3bff224f952baab0bcf996cf032f3a81d336b8aa60d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f224
7750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:32Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.045969 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qm6mr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77b56a8b-1a27-4727-b45e-43fbc3847ddd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sffkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sffkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qm6mr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:32Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:32 crc 
kubenswrapper[4735]: I1001 10:18:32.057000 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73b8adee3754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:32Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.069041 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2be33635-a388-4c92-a1e6-9a0736dbbe79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07aa8a6d6d8611abb4a1531fc38c8bbdca73cf532d2f45645fcfb0eaeb92fec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e93060fbaf012ed4f8aec1624caf86aa986cea91a7ba97b7fb1dffa87343092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accb544db17576f62271088d4e35c414c0ed291197cf7d953466d7634edff0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb0cece7c30c3fbd18adeb8daa644b951b45a159308fa26eec931f95f0ac881\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7cb0cece7c30c3fbd18adeb8daa644b951b45a159308fa26eec931f95f0ac881\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:32Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.073845 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.073884 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.073895 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.073911 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.073923 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:32Z","lastTransitionTime":"2025-10-01T10:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.078772 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:32Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:32 crc kubenswrapper[4735]: E1001 10:18:32.084951 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:32Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.088090 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.088115 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.088124 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.088137 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.088146 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:32Z","lastTransitionTime":"2025-10-01T10:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.098632 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24dc0b6402fc5d62cac780c184d5adcb576e58a3b564daef937eef52531a7274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24dc0b6402fc5d62cac780c184d5adcb576e58a3b564daef937eef52531a7274\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T10:18:07Z\\\",\\\"message\\\":\\\"kube-apiserver/kube-apiserver-crc openshift-kube-controller-manager/kube-controller-manager-crc openshift-multus/network-metrics-daemon-qm6mr]\\\\nF1001 10:18:07.648229 6415 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: 
could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:07Z is after 2025-08-24T17:21:41Z]\\\\nI1001 10:18:07.648215 6415 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-cluster-version/cluster-version-operator]} name:Service_openshift-cluster-version/cluster-version-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:18:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-k5mgz_openshift-ovn-kubernetes(32f1531f-5034-48d4-b694-efc774226e37)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c
026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:32Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:32 crc kubenswrapper[4735]: E1001 10:18:32.098652 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:32Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.102331 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.102475 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.102598 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.102686 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.102769 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:32Z","lastTransitionTime":"2025-10-01T10:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.110665 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9b48b18fd3ad59b9decc42e44539f17f32fb5d6c811e3d2beda91071fc76fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbe000aa1f2ef04cd49dd91e7ec18f7f040623e9cf2f86354c8ae6fbab575080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zcvzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:32Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:32 crc kubenswrapper[4735]: E1001 10:18:32.114330 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:32Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.118355 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.118378 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.118388 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.118402 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.118412 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:32Z","lastTransitionTime":"2025-10-01T10:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.132679 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.132707 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.132715 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.132732 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.132741 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:32Z","lastTransitionTime":"2025-10-01T10:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:32 crc kubenswrapper[4735]: E1001 10:18:32.142756 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:32Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:32 crc kubenswrapper[4735]: E1001 10:18:32.142867 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.143987 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.144028 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.144041 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.144058 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.144071 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:32Z","lastTransitionTime":"2025-10-01T10:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.245910 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.245943 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.245952 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.245964 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.245972 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:32Z","lastTransitionTime":"2025-10-01T10:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.258102 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8dz9b_5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7/kube-multus/0.log" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.258160 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8dz9b" event={"ID":"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7","Type":"ContainerStarted","Data":"1dc2735dc0ade0160be84ac2d38365c7691541f3aeda56026e5489a2a2b6a556"} Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.268348 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:32Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.282062 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98eb54784a97eb54cbe3bff224f952baab0bcf996cf032f3a81d336b8aa60d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f224
7750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:32Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.294735 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qm6mr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77b56a8b-1a27-4727-b45e-43fbc3847ddd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sffkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sffkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qm6mr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:32Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:32 crc 
kubenswrapper[4735]: I1001 10:18:32.307600 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:32Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.318246 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-1
0-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73
b8adee3754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:32Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.330372 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2be33635-a388-4c92-a1e6-9a0736dbbe79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07aa8a6d6d8611abb4a1531fc38c8bbdca73cf532d2f45645fcfb0eaeb92fec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e93060fbaf012ed4f8aec1624caf86aa986cea91a7ba97b7fb1dffa87343092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accb544db17576f62271088d4e35c414c0ed291197cf7d953466d7634edff0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb0cece7c30c3fbd18adeb8daa644b951b45a159308fa26eec931f95f0ac881\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7cb0cece7c30c3fbd18adeb8daa644b951b45a159308fa26eec931f95f0ac881\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:32Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.340978 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff408481
5fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:32Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.347927 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.347962 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.347974 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:32 crc 
kubenswrapper[4735]: I1001 10:18:32.347990 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.348005 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:32Z","lastTransitionTime":"2025-10-01T10:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.360219 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24dc0b6402fc5d62cac780c184d5adcb576e58a3b564daef937eef52531a7274\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24dc0b6402fc5d62cac780c184d5adcb576e58a3b564daef937eef52531a7274\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T10:18:07Z\\\",\\\"message\\\":\\\"kube-apiserver/kube-apiserver-crc openshift-kube-controller-manager/kube-controller-manager-crc openshift-multus/network-metrics-daemon-qm6mr]\\\\nF1001 10:18:07.648229 6415 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: 
could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:07Z is after 2025-08-24T17:21:41Z]\\\\nI1001 10:18:07.648215 6415 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-cluster-version/cluster-version-operator]} name:Service_openshift-cluster-version/cluster-version-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:18:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-k5mgz_openshift-ovn-kubernetes(32f1531f-5034-48d4-b694-efc774226e37)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c
026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:32Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.369422 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9b48b18fd3ad59b9decc42e44539f17f32fb5d6c811e3d2beda91071fc76fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbe000aa1f2ef04cd49dd91e7ec18f7f04062
3e9cf2f86354c8ae6fbab575080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zcvzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:32Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.380301 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:32Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.388957 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:18:32Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.400357 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc2735dc0ade0160be84ac2d38365c7691541f3aeda56026e5489a2a2b6a556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T10:18:30Z\\\",\\\"message\\\":\\\"2025-10-01T10:17:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_4e070c81-c0cb-4c53-bad9-c116c63efe93\\\\n2025-10-01T10:17:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4e070c81-c0cb-4c53-bad9-c116c63efe93 to /host/opt/cni/bin/\\\\n2025-10-01T10:17:45Z [verbose] multus-daemon started\\\\n2025-10-01T10:17:45Z [verbose] Readiness Indicator file check\\\\n2025-10-01T10:18:30Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:18:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:32Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.409035 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b331af5f1d4f4b11c27d1c2285e760a27f9af30ff7fba24c2b73b63385f1e9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:32Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.421418 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:32Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.431623 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:32Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.441765 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b444968cd2dd7c27e3ab5de010228bdb6aca03388c8bf326ff5dc5b65d59a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a3
2554baabf84253c7bb4d630c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:32Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.450570 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.450609 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.450618 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:32 crc 
kubenswrapper[4735]: I1001 10:18:32.450633 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.450642 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:32Z","lastTransitionTime":"2025-10-01T10:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.454775 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10
:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 
10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:32Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.553055 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.553090 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.553102 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.553117 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.553128 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:32Z","lastTransitionTime":"2025-10-01T10:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.655586 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.655623 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.655634 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.655650 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.655661 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:32Z","lastTransitionTime":"2025-10-01T10:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.757557 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.757597 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.757609 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.757624 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.757636 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:32Z","lastTransitionTime":"2025-10-01T10:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.859674 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.859709 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.859720 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.859736 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.859747 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:32Z","lastTransitionTime":"2025-10-01T10:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.896945 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.896979 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.897309 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.897380 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:18:32 crc kubenswrapper[4735]: E1001 10:18:32.897483 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.897542 4735 scope.go:117] "RemoveContainer" containerID="24dc0b6402fc5d62cac780c184d5adcb576e58a3b564daef937eef52531a7274" Oct 01 10:18:32 crc kubenswrapper[4735]: E1001 10:18:32.897683 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:18:32 crc kubenswrapper[4735]: E1001 10:18:32.898227 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:18:32 crc kubenswrapper[4735]: E1001 10:18:32.898523 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.962200 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.962242 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.962255 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.962273 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:32 crc kubenswrapper[4735]: I1001 10:18:32.962285 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:32Z","lastTransitionTime":"2025-10-01T10:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.064803 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.064822 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.064829 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.064842 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.064850 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:33Z","lastTransitionTime":"2025-10-01T10:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.167296 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.167615 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.167625 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.167665 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.167675 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:33Z","lastTransitionTime":"2025-10-01T10:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.262629 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k5mgz_32f1531f-5034-48d4-b694-efc774226e37/ovnkube-controller/2.log" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.266488 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" event={"ID":"32f1531f-5034-48d4-b694-efc774226e37","Type":"ContainerStarted","Data":"2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0"} Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.267013 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.269810 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.269838 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.269849 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.269865 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.269877 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:33Z","lastTransitionTime":"2025-10-01T10:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.276268 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4084815fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:33Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.294536 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24dc0b6402fc5d62cac780c184d5adcb576e58a3b564daef937eef52531a7274\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T10:18:07Z\\\",\\\"message\\\":\\\"kube-apiserver/kube-apiserver-crc openshift-kube-controller-manager/kube-controller-manager-crc openshift-multus/network-metrics-daemon-qm6mr]\\\\nF1001 10:18:07.648229 6415 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: 
could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:07Z is after 2025-08-24T17:21:41Z]\\\\nI1001 10:18:07.648215 6415 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-cluster-version/cluster-version-operator]} name:Service_openshift-cluster-version/cluster-version-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 
neighbo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:18:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:18:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:33Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.304976 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9b48b18fd3ad59b9decc42e44539f17f32fb5d6c811e3d2beda91071fc76fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbe000aa1f2ef04cd49dd91e7ec18f7f04062
3e9cf2f86354c8ae6fbab575080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zcvzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:33Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.316113 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73b8adee3754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:33Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.329785 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2be33635-a388-4c92-a1e6-9a0736dbbe79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07aa8a6d6d8611abb4a1531fc38c8bbdca73cf532d2f45645fcfb0eaeb92fec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e93060fbaf012ed4f8aec1624caf86aa986cea91a7ba97b7fb1dffa87343092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accb544db17576f62271088d4e35c414c0ed291197cf7d953466d7634edff0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb0cece7c30c3fbd18adeb8daa644b951b45a159308fa26eec931f95f0ac881\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7cb0cece7c30c3fbd18adeb8daa644b951b45a159308fa26eec931f95f0ac881\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:33Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.338479 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b33
1af5f1d4f4b11c27d1c2285e760a27f9af30ff7fba24c2b73b63385f1e9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:33Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.348742 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:33Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.360442 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:33Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.371832 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.371870 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.371880 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.371895 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.371906 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:33Z","lastTransitionTime":"2025-10-01T10:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.373249 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:33Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.390879 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc2735dc0ade0160be84ac2d38365c7691541f3aeda56026e5489a2a2b6a556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T10:18:30Z\\\",\\\"message\\\":\\\"2025-10-01T10:17:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4e070c81-c0cb-4c53-bad9-c116c63efe93\\\\n2025-10-01T10:17:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4e070c81-c0cb-4c53-bad9-c116c63efe93 to /host/opt/cni/bin/\\\\n2025-10-01T10:17:45Z [verbose] multus-daemon started\\\\n2025-10-01T10:17:45Z [verbose] Readiness Indicator file check\\\\n2025-10-01T10:18:30Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:18:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:33Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.405297 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee122
0d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 
10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:33Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.419290 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:33Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.441952 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b444968cd2dd7c27e3ab5de010228bdb6aca03388c8bf326ff5dc5b65d59a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a3
2554baabf84253c7bb4d630c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:33Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.459982 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:33Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.471293 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:33Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.473810 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.473838 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.473847 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.473863 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.473872 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:33Z","lastTransitionTime":"2025-10-01T10:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.485433 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98eb54784a97eb54cbe3bff224f952baab0bcf996cf032f3a81d336b8aa60d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:46Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:33Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.496102 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qm6mr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77b56a8b-1a27-4727-b45e-43fbc3847ddd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sffkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sffkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qm6mr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:33Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.575893 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.575932 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.575943 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.575958 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.575970 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:33Z","lastTransitionTime":"2025-10-01T10:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.677872 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.677909 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.677919 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.677935 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.677946 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:33Z","lastTransitionTime":"2025-10-01T10:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.779991 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.780023 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.780031 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.780044 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.780053 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:33Z","lastTransitionTime":"2025-10-01T10:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.882534 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.882573 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.882584 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.882599 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.882608 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:33Z","lastTransitionTime":"2025-10-01T10:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.984866 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.984913 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.984922 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.984937 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:33 crc kubenswrapper[4735]: I1001 10:18:33.984950 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:33Z","lastTransitionTime":"2025-10-01T10:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.087247 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.087318 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.087333 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.087357 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.087375 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:34Z","lastTransitionTime":"2025-10-01T10:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.190509 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.190561 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.190570 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.190584 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.190598 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:34Z","lastTransitionTime":"2025-10-01T10:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.270634 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k5mgz_32f1531f-5034-48d4-b694-efc774226e37/ovnkube-controller/3.log" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.271120 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k5mgz_32f1531f-5034-48d4-b694-efc774226e37/ovnkube-controller/2.log" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.273193 4735 generic.go:334] "Generic (PLEG): container finished" podID="32f1531f-5034-48d4-b694-efc774226e37" containerID="2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0" exitCode=1 Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.273228 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" event={"ID":"32f1531f-5034-48d4-b694-efc774226e37","Type":"ContainerDied","Data":"2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0"} Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.273263 4735 scope.go:117] "RemoveContainer" containerID="24dc0b6402fc5d62cac780c184d5adcb576e58a3b564daef937eef52531a7274" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.274274 4735 scope.go:117] "RemoveContainer" containerID="2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0" Oct 01 10:18:34 crc kubenswrapper[4735]: E1001 10:18:34.274446 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-k5mgz_openshift-ovn-kubernetes(32f1531f-5034-48d4-b694-efc774226e37)\"" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" podUID="32f1531f-5034-48d4-b694-efc774226e37" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.288365 4735 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:34Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.292154 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.292190 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.292201 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.292215 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.292225 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:34Z","lastTransitionTime":"2025-10-01T10:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.311125 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98eb54784a97eb54cbe3bff224f952baab0bcf996cf032f3a81d336b8aa60d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:46Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:34Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.322953 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qm6mr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77b56a8b-1a27-4727-b45e-43fbc3847ddd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sffkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sffkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qm6mr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:34Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.335854 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:34Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.347343 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\
\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73b8adee3754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:34Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.361181 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2be33635-a388-4c92-a1e6-9a0736dbbe79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07aa8a6d6d8611abb4a1531fc38c8bbdca73cf532d2f45645fcfb0eaeb92fec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e93060fbaf012ed4f8aec1624caf86aa986cea91a7ba97b7fb1dffa87343092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accb544db17576f62271088d4e35c414c0ed291197cf7d953466d7634edff0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb0cece7c30c3fbd18adeb8daa644b951b45a159308fa26eec931f95f0ac881\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7cb0cece7c30c3fbd18adeb8daa644b951b45a159308fa26eec931f95f0ac881\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:34Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.373992 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff408481
5fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:34Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.393163 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24dc0b6402fc5d62cac780c184d5adcb576e58a3b564daef937eef52531a7274\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T10:18:07Z\\\",\\\"message\\\":\\\"kube-apiserver/kube-apiserver-crc openshift-kube-controller-manager/kube-controller-manager-crc openshift-multus/network-metrics-daemon-qm6mr]\\\\nF1001 10:18:07.648229 6415 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: 
could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:07Z is after 2025-08-24T17:21:41Z]\\\\nI1001 10:18:07.648215 6415 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-cluster-version/cluster-version-operator]} name:Service_openshift-cluster-version/cluster-version-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:18:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T10:18:33Z\\\",\\\"message\\\":\\\"001 10:18:33.600387 6760 ovnkube.go:599] Stopped ovnkube\\\\nI1001 10:18:33.600397 6760 services_controller.go:356] Processing sync for service openshift-operator-lifecycle-manager/catalog-operator-metrics for network=default\\\\nI1001 10:18:33.600387 6760 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 10:18:33.600409 6760 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 10:18:33.600424 6760 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1001 10:18:33.600455 6760 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1001 10:18:33.600517 6760 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:18:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir
\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\
\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:34Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:34 crc kubenswrapper[4735]: 
I1001 10:18:34.395764 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.395805 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.395816 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.395832 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.395844 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:34Z","lastTransitionTime":"2025-10-01T10:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.407973 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9b48b18fd3ad59b9decc42e44539f17f32fb5d6c811e3d2beda91071fc76fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbe000aa1f2ef04cd49dd91e7ec18f7f040623e9cf2f86354c8ae6fbab575080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zcvzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:34Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.419899 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:34Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.431853 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:18:34Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.443699 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc2735dc0ade0160be84ac2d38365c7691541f3aeda56026e5489a2a2b6a556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T10:18:30Z\\\",\\\"message\\\":\\\"2025-10-01T10:17:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_4e070c81-c0cb-4c53-bad9-c116c63efe93\\\\n2025-10-01T10:17:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4e070c81-c0cb-4c53-bad9-c116c63efe93 to /host/opt/cni/bin/\\\\n2025-10-01T10:17:45Z [verbose] multus-daemon started\\\\n2025-10-01T10:17:45Z [verbose] Readiness Indicator file check\\\\n2025-10-01T10:18:30Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:18:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:34Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.453183 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b331af5f1d4f4b11c27d1c2285e760a27f9af30ff7fba24c2b73b63385f1e9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:34Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.466134 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:34Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.479103 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:34Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.489540 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b444968cd2dd7c27e3ab5de010228bdb6aca03388c8bf326ff5dc5b65d59a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a3
2554baabf84253c7bb4d630c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:34Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.498663 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.498706 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.498719 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:34 crc 
kubenswrapper[4735]: I1001 10:18:34.498737 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.498750 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:34Z","lastTransitionTime":"2025-10-01T10:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.502878 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10
:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 
10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:34Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.601116 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.601148 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.601159 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.601173 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.601183 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:34Z","lastTransitionTime":"2025-10-01T10:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.703798 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.703831 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.703839 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.703854 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.703864 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:34Z","lastTransitionTime":"2025-10-01T10:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.806248 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.806296 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.806309 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.806326 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.806340 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:34Z","lastTransitionTime":"2025-10-01T10:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.896972 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.897005 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.897024 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.897011 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:18:34 crc kubenswrapper[4735]: E1001 10:18:34.897119 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:18:34 crc kubenswrapper[4735]: E1001 10:18:34.897267 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:18:34 crc kubenswrapper[4735]: E1001 10:18:34.897356 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:18:34 crc kubenswrapper[4735]: E1001 10:18:34.897529 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.909244 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.909268 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.909306 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.909322 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:34 crc kubenswrapper[4735]: I1001 10:18:34.909334 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:34Z","lastTransitionTime":"2025-10-01T10:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.012257 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.012303 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.012317 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.012334 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.012347 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:35Z","lastTransitionTime":"2025-10-01T10:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.114401 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.114443 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.114456 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.114473 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.114484 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:35Z","lastTransitionTime":"2025-10-01T10:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.217186 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.217458 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.217551 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.217636 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.217703 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:35Z","lastTransitionTime":"2025-10-01T10:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.277630 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k5mgz_32f1531f-5034-48d4-b694-efc774226e37/ovnkube-controller/3.log" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.280964 4735 scope.go:117] "RemoveContainer" containerID="2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0" Oct 01 10:18:35 crc kubenswrapper[4735]: E1001 10:18:35.281160 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-k5mgz_openshift-ovn-kubernetes(32f1531f-5034-48d4-b694-efc774226e37)\"" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" podUID="32f1531f-5034-48d4-b694-efc774226e37" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.293194 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:35Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.303405 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b444968cd2dd7c27e3ab5de010228bdb6aca03388c8bf326ff5dc5b65d59a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a3
2554baabf84253c7bb4d630c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:35Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.315783 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7
f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:35Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.319355 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.319395 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.319404 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.319420 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.319433 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:35Z","lastTransitionTime":"2025-10-01T10:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.327170 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:35Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.339432 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98eb54784a97eb54cbe3bff224f952baab0bcf996cf032f3a81d336b8aa60d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f224
7750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:35Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.351685 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qm6mr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77b56a8b-1a27-4727-b45e-43fbc3847ddd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sffkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sffkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qm6mr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:35Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:35 crc 
kubenswrapper[4735]: I1001 10:18:35.364526 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:35Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.378427 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-1
0-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73
b8adee3754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:35Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.388846 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2be33635-a388-4c92-a1e6-9a0736dbbe79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07aa8a6d6d8611abb4a1531fc38c8bbdca73cf532d2f45645fcfb0eaeb92fec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e93060fbaf012ed4f8aec1624caf86aa986cea91a7ba97b7fb1dffa87343092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accb544db17576f62271088d4e35c414c0ed291197cf7d953466d7634edff0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb0cece7c30c3fbd18adeb8daa644b951b45a159308fa26eec931f95f0ac881\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7cb0cece7c30c3fbd18adeb8daa644b951b45a159308fa26eec931f95f0ac881\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:35Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.401270 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff408481
5fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:35Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.420985 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T10:18:33Z\\\",\\\"message\\\":\\\"001 10:18:33.600387 6760 ovnkube.go:599] Stopped ovnkube\\\\nI1001 10:18:33.600397 6760 services_controller.go:356] Processing sync for service openshift-operator-lifecycle-manager/catalog-operator-metrics for network=default\\\\nI1001 10:18:33.600387 6760 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch 
Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 10:18:33.600409 6760 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 10:18:33.600424 6760 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1001 10:18:33.600455 6760 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1001 10:18:33.600517 6760 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:18:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-k5mgz_openshift-ovn-kubernetes(32f1531f-5034-48d4-b694-efc774226e37)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c
026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:35Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.421718 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.421757 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.421779 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.421794 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.421804 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:35Z","lastTransitionTime":"2025-10-01T10:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.439329 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9b48b18fd3ad59b9decc42e44539f17f32fb5d6c811e3d2beda91071fc76fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbe000aa1f2ef04cd49dd91e7ec18f7f040623e9cf2f86354c8ae6fbab575080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zcvzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:35Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.454046 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:35Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.465891 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:18:35Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.477189 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc2735dc0ade0160be84ac2d38365c7691541f3aeda56026e5489a2a2b6a556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T10:18:30Z\\\",\\\"message\\\":\\\"2025-10-01T10:17:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_4e070c81-c0cb-4c53-bad9-c116c63efe93\\\\n2025-10-01T10:17:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4e070c81-c0cb-4c53-bad9-c116c63efe93 to /host/opt/cni/bin/\\\\n2025-10-01T10:17:45Z [verbose] multus-daemon started\\\\n2025-10-01T10:17:45Z [verbose] Readiness Indicator file check\\\\n2025-10-01T10:18:30Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:18:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:35Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.491647 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b331af5f1d4f4b11c27d1c2285e760a27f9af30ff7fba24c2b73b63385f1e9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:35Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.505165 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:35Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.523961 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.524001 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.524013 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.524028 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.524039 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:35Z","lastTransitionTime":"2025-10-01T10:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.626399 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.626430 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.626438 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.626451 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.626461 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:35Z","lastTransitionTime":"2025-10-01T10:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.728674 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.728723 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.728736 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.728755 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.728767 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:35Z","lastTransitionTime":"2025-10-01T10:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.830900 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.830937 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.830944 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.830958 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.830966 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:35Z","lastTransitionTime":"2025-10-01T10:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.933651 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.933694 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.933702 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.933717 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:35 crc kubenswrapper[4735]: I1001 10:18:35.933726 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:35Z","lastTransitionTime":"2025-10-01T10:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.036318 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.036352 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.036363 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.036375 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.036383 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:36Z","lastTransitionTime":"2025-10-01T10:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.138050 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.138116 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.138129 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.138144 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.138156 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:36Z","lastTransitionTime":"2025-10-01T10:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.240487 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.240544 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.240558 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.240574 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.240588 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:36Z","lastTransitionTime":"2025-10-01T10:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.342423 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.342478 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.342519 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.342543 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.342561 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:36Z","lastTransitionTime":"2025-10-01T10:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.444231 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.444271 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.444283 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.444299 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.444309 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:36Z","lastTransitionTime":"2025-10-01T10:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.546699 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.546738 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.546750 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.546772 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.546788 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:36Z","lastTransitionTime":"2025-10-01T10:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.648923 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.648964 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.648978 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.648994 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.649005 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:36Z","lastTransitionTime":"2025-10-01T10:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.752423 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.752515 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.752526 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.752542 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.752550 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:36Z","lastTransitionTime":"2025-10-01T10:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.854480 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.854529 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.854541 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.854556 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.854566 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:36Z","lastTransitionTime":"2025-10-01T10:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.895974 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.896004 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.895990 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.895974 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:18:36 crc kubenswrapper[4735]: E1001 10:18:36.896095 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:18:36 crc kubenswrapper[4735]: E1001 10:18:36.896184 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:18:36 crc kubenswrapper[4735]: E1001 10:18:36.896282 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:18:36 crc kubenswrapper[4735]: E1001 10:18:36.896345 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.957540 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.957575 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.957584 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.957596 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:36 crc kubenswrapper[4735]: I1001 10:18:36.957604 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:36Z","lastTransitionTime":"2025-10-01T10:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.060247 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.060278 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.060286 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.060300 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.060310 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:37Z","lastTransitionTime":"2025-10-01T10:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.162313 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.162353 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.162363 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.162378 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.162389 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:37Z","lastTransitionTime":"2025-10-01T10:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.265188 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.265228 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.265238 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.265251 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.265260 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:37Z","lastTransitionTime":"2025-10-01T10:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.368335 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.368367 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.368376 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.368391 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.368402 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:37Z","lastTransitionTime":"2025-10-01T10:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.471234 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.471277 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.471286 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.471299 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.471309 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:37Z","lastTransitionTime":"2025-10-01T10:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.574727 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.574807 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.574830 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.574861 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.574883 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:37Z","lastTransitionTime":"2025-10-01T10:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.678066 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.678112 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.678126 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.678143 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.678156 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:37Z","lastTransitionTime":"2025-10-01T10:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.780705 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.780743 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.780754 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.780767 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.780778 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:37Z","lastTransitionTime":"2025-10-01T10:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.883116 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.883149 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.883160 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.883174 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.883183 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:37Z","lastTransitionTime":"2025-10-01T10:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.985534 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.985572 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.985581 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.985594 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:37 crc kubenswrapper[4735]: I1001 10:18:37.985603 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:37Z","lastTransitionTime":"2025-10-01T10:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.088301 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.088339 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.088347 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.088361 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.088370 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:38Z","lastTransitionTime":"2025-10-01T10:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.191276 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.191315 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.191323 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.191344 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.191368 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:38Z","lastTransitionTime":"2025-10-01T10:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.293819 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.293848 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.293856 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.293868 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.293878 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:38Z","lastTransitionTime":"2025-10-01T10:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.395816 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.395873 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.395889 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.395912 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.395928 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:38Z","lastTransitionTime":"2025-10-01T10:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.498975 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.499013 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.499023 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.499042 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.499053 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:38Z","lastTransitionTime":"2025-10-01T10:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.601720 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.601747 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.601755 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.601767 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.601776 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:38Z","lastTransitionTime":"2025-10-01T10:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.704209 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.704284 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.704307 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.704334 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.704356 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:38Z","lastTransitionTime":"2025-10-01T10:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.807297 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.807343 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.807355 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.807372 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.807385 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:38Z","lastTransitionTime":"2025-10-01T10:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.896689 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.896742 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.896829 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:18:38 crc kubenswrapper[4735]: E1001 10:18:38.896969 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.896977 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:18:38 crc kubenswrapper[4735]: E1001 10:18:38.897070 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:18:38 crc kubenswrapper[4735]: E1001 10:18:38.897156 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:18:38 crc kubenswrapper[4735]: E1001 10:18:38.897252 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.909474 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.909536 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.909546 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.909560 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:38 crc kubenswrapper[4735]: I1001 10:18:38.909571 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:38Z","lastTransitionTime":"2025-10-01T10:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.011189 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.011271 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.011280 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.011292 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.011301 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:39Z","lastTransitionTime":"2025-10-01T10:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.113415 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.113448 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.113456 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.113469 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.113477 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:39Z","lastTransitionTime":"2025-10-01T10:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.215832 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.215863 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.215871 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.215885 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.215894 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:39Z","lastTransitionTime":"2025-10-01T10:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.318226 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.318251 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.318260 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.318273 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.318281 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:39Z","lastTransitionTime":"2025-10-01T10:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.420084 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.420113 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.420122 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.420136 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.420146 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:39Z","lastTransitionTime":"2025-10-01T10:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.522548 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.522603 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.522614 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.522631 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.522644 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:39Z","lastTransitionTime":"2025-10-01T10:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.625098 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.625132 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.625142 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.625158 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.625167 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:39Z","lastTransitionTime":"2025-10-01T10:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.727263 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.727588 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.727656 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.727723 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.727790 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:39Z","lastTransitionTime":"2025-10-01T10:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.831534 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.831586 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.831598 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.831615 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.831626 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:39Z","lastTransitionTime":"2025-10-01T10:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.934358 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.934392 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.934400 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.934415 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:39 crc kubenswrapper[4735]: I1001 10:18:39.934424 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:39Z","lastTransitionTime":"2025-10-01T10:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.037131 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.037167 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.037178 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.037194 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.037204 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:40Z","lastTransitionTime":"2025-10-01T10:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.139763 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.139804 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.139814 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.139829 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.139839 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:40Z","lastTransitionTime":"2025-10-01T10:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.242520 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.242607 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.242624 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.242645 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.242657 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:40Z","lastTransitionTime":"2025-10-01T10:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.345379 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.345434 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.345445 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.345458 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.345466 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:40Z","lastTransitionTime":"2025-10-01T10:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.449268 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.449330 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.449362 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.449389 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.449407 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:40Z","lastTransitionTime":"2025-10-01T10:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.552645 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.552710 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.552722 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.552766 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.552780 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:40Z","lastTransitionTime":"2025-10-01T10:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.656219 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.656303 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.656328 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.656355 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.656371 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:40Z","lastTransitionTime":"2025-10-01T10:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.760532 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.760588 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.760598 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.760616 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.760638 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:40Z","lastTransitionTime":"2025-10-01T10:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.863779 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.863814 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.863824 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.863839 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.863852 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:40Z","lastTransitionTime":"2025-10-01T10:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.896885 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:18:40 crc kubenswrapper[4735]: E1001 10:18:40.897000 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.897188 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.897207 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:18:40 crc kubenswrapper[4735]: E1001 10:18:40.897274 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.897346 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:18:40 crc kubenswrapper[4735]: E1001 10:18:40.897457 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:18:40 crc kubenswrapper[4735]: E1001 10:18:40.897529 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.966094 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.966143 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.966156 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.966171 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:40 crc kubenswrapper[4735]: I1001 10:18:40.966182 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:40Z","lastTransitionTime":"2025-10-01T10:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.068470 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.068521 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.068532 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.068545 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.068555 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:41Z","lastTransitionTime":"2025-10-01T10:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.170388 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.170419 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.170427 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.170442 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.170450 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:41Z","lastTransitionTime":"2025-10-01T10:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.272344 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.272384 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.272395 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.272411 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.272422 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:41Z","lastTransitionTime":"2025-10-01T10:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.374567 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.374638 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.374652 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.374670 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.374682 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:41Z","lastTransitionTime":"2025-10-01T10:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.477172 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.477209 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.477218 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.477232 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.477241 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:41Z","lastTransitionTime":"2025-10-01T10:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.580073 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.580117 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.580129 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.580143 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.580157 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:41Z","lastTransitionTime":"2025-10-01T10:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.683176 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.683221 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.683231 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.683246 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.683257 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:41Z","lastTransitionTime":"2025-10-01T10:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.785910 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.785949 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.785960 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.785975 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.785985 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:41Z","lastTransitionTime":"2025-10-01T10:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.888703 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.888759 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.888770 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.888788 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.888801 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:41Z","lastTransitionTime":"2025-10-01T10:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.911703 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:41Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.925446 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:41Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.937957 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b444968cd2dd7c27e3ab5de010228bdb6aca03388c8bf326ff5dc5b65d59a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a3
2554baabf84253c7bb4d630c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:41Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.953211 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:41Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.966591 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:41Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.980628 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98eb54784a97eb54cbe3bff224f952baab0bcf996cf032f3a81d336b8aa60d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f224
7750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:41Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.990057 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.990088 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.990096 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.990126 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.990137 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:41Z","lastTransitionTime":"2025-10-01T10:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:41 crc kubenswrapper[4735]: I1001 10:18:41.992791 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qm6mr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77b56a8b-1a27-4727-b45e-43fbc3847ddd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sffkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sffkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qm6mr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:41Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:42 crc 
kubenswrapper[4735]: I1001 10:18:42.012728 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T10:18:33Z\\\",\\\"message\\\":\\\"001 10:18:33.600387 6760 ovnkube.go:599] Stopped ovnkube\\\\nI1001 10:18:33.600397 6760 services_controller.go:356] Processing sync for service openshift-operator-lifecycle-manager/catalog-operator-metrics for network=default\\\\nI1001 10:18:33.600387 6760 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch 
Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 10:18:33.600409 6760 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 10:18:33.600424 6760 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1001 10:18:33.600455 6760 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1001 10:18:33.600517 6760 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:18:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-k5mgz_openshift-ovn-kubernetes(32f1531f-5034-48d4-b694-efc774226e37)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c
026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.024522 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9b48b18fd3ad59b9decc42e44539f17f32fb5d6c811e3d2beda91071fc76fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbe000aa1f2ef04cd49dd91e7ec18f7f04062
3e9cf2f86354c8ae6fbab575080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zcvzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.035790 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73b8adee3754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.046103 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2be33635-a388-4c92-a1e6-9a0736dbbe79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07aa8a6d6d8611abb4a1531fc38c8bbdca73cf532d2f45645fcfb0eaeb92fec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e93060fbaf012ed4f8aec1624caf86aa986cea91a7ba97b7fb1dffa87343092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accb544db17576f62271088d4e35c414c0ed291197cf7d953466d7634edff0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb0cece7c30c3fbd18adeb8daa644b951b45a159308fa26eec931f95f0ac881\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7cb0cece7c30c3fbd18adeb8daa644b951b45a159308fa26eec931f95f0ac881\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.056526 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff408481
5fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.068639 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.081749 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.092801 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.092835 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.092844 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.092857 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.092866 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:42Z","lastTransitionTime":"2025-10-01T10:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.094040 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.106577 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc2735dc0ade0160be84ac2d38365c7691541f3aeda56026e5489a2a2b6a556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T10:18:30Z\\\",\\\"message\\\":\\\"2025-10-01T10:17:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4e070c81-c0cb-4c53-bad9-c116c63efe93\\\\n2025-10-01T10:17:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4e070c81-c0cb-4c53-bad9-c116c63efe93 to /host/opt/cni/bin/\\\\n2025-10-01T10:17:45Z [verbose] multus-daemon started\\\\n2025-10-01T10:17:45Z [verbose] Readiness Indicator file check\\\\n2025-10-01T10:18:30Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:18:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.117374 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b331af5f1d4f4b11c27d1c2285e760a27f9af30ff7fba24c2b73b63385f1e9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.195396 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.195439 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.195449 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.195464 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 
10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.195634 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:42Z","lastTransitionTime":"2025-10-01T10:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.298459 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.298490 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.298512 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.298526 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.298536 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:42Z","lastTransitionTime":"2025-10-01T10:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.380076 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.380124 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.380140 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.380160 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.380177 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:42Z","lastTransitionTime":"2025-10-01T10:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:42 crc kubenswrapper[4735]: E1001 10:18:42.392716 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.395886 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.395917 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.396012 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.396033 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.396045 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:42Z","lastTransitionTime":"2025-10-01T10:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:42 crc kubenswrapper[4735]: E1001 10:18:42.407278 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.410594 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.410625 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.410636 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.410651 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.410660 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:42Z","lastTransitionTime":"2025-10-01T10:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:42 crc kubenswrapper[4735]: E1001 10:18:42.421362 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.425331 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.425367 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.425375 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.425390 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.425400 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:42Z","lastTransitionTime":"2025-10-01T10:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:42 crc kubenswrapper[4735]: E1001 10:18:42.439195 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.442525 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.442778 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.442846 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.442904 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.442962 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:42Z","lastTransitionTime":"2025-10-01T10:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:42 crc kubenswrapper[4735]: E1001 10:18:42.453295 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:42Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:42 crc kubenswrapper[4735]: E1001 10:18:42.453467 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.455269 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.455296 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.455308 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.455323 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.455334 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:42Z","lastTransitionTime":"2025-10-01T10:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.557643 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.557695 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.557703 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.557718 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.557730 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:42Z","lastTransitionTime":"2025-10-01T10:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.660627 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.660665 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.660676 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.660693 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.660703 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:42Z","lastTransitionTime":"2025-10-01T10:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.762891 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.762924 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.762935 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.762949 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.762959 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:42Z","lastTransitionTime":"2025-10-01T10:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.865079 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.865114 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.865124 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.865140 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.865148 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:42Z","lastTransitionTime":"2025-10-01T10:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.896309 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.896379 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:18:42 crc kubenswrapper[4735]: E1001 10:18:42.896412 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.896309 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.896483 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:18:42 crc kubenswrapper[4735]: E1001 10:18:42.896487 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:18:42 crc kubenswrapper[4735]: E1001 10:18:42.896750 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:18:42 crc kubenswrapper[4735]: E1001 10:18:42.896841 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.967389 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.967447 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.967459 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.967473 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:42 crc kubenswrapper[4735]: I1001 10:18:42.967516 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:42Z","lastTransitionTime":"2025-10-01T10:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.069801 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.069842 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.069850 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.069864 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.069873 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:43Z","lastTransitionTime":"2025-10-01T10:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.171869 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.171907 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.171915 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.171931 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.171944 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:43Z","lastTransitionTime":"2025-10-01T10:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.274748 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.274789 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.274801 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.274819 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.274834 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:43Z","lastTransitionTime":"2025-10-01T10:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.377952 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.378031 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.378058 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.378457 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.378485 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:43Z","lastTransitionTime":"2025-10-01T10:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.481058 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.481119 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.481136 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.481159 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.481177 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:43Z","lastTransitionTime":"2025-10-01T10:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.584584 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.584671 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.584692 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.584717 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.584740 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:43Z","lastTransitionTime":"2025-10-01T10:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.687134 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.687176 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.687189 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.687205 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.687223 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:43Z","lastTransitionTime":"2025-10-01T10:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.789683 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.789724 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.789732 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.789767 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.789776 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:43Z","lastTransitionTime":"2025-10-01T10:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.892040 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.892101 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.892124 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.892148 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.892167 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:43Z","lastTransitionTime":"2025-10-01T10:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.995353 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.995418 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.995442 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.995472 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:43 crc kubenswrapper[4735]: I1001 10:18:43.995490 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:43Z","lastTransitionTime":"2025-10-01T10:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.098041 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.098075 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.098083 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.098096 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.098107 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:44Z","lastTransitionTime":"2025-10-01T10:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.200434 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.200487 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.200523 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.200543 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.200555 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:44Z","lastTransitionTime":"2025-10-01T10:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.302743 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.302793 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.302805 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.302821 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.302833 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:44Z","lastTransitionTime":"2025-10-01T10:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.404911 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.404953 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.404961 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.404977 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.404987 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:44Z","lastTransitionTime":"2025-10-01T10:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.507184 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.507226 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.507237 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.507251 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.507261 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:44Z","lastTransitionTime":"2025-10-01T10:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.609723 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.609768 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.609783 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.609804 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.609816 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:44Z","lastTransitionTime":"2025-10-01T10:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.712278 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.712329 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.712355 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.712369 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.712378 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:44Z","lastTransitionTime":"2025-10-01T10:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.816029 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.816098 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.816117 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.816151 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.816173 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:44Z","lastTransitionTime":"2025-10-01T10:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.896045 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.896119 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.896115 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.896140 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:18:44 crc kubenswrapper[4735]: E1001 10:18:44.896238 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:18:44 crc kubenswrapper[4735]: E1001 10:18:44.896364 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:18:44 crc kubenswrapper[4735]: E1001 10:18:44.896579 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:18:44 crc kubenswrapper[4735]: E1001 10:18:44.896692 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.898533 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.898667 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.898696 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.898734 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.898754 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" 
(UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:18:44 crc kubenswrapper[4735]: E1001 10:18:44.898807 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 10:18:44 crc kubenswrapper[4735]: E1001 10:18:44.898816 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:48.898786497 +0000 UTC m=+147.591607799 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:18:44 crc kubenswrapper[4735]: E1001 10:18:44.898860 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 10:19:48.898840048 +0000 UTC m=+147.591661450 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 10:18:44 crc kubenswrapper[4735]: E1001 10:18:44.898893 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 10:18:44 crc kubenswrapper[4735]: E1001 10:18:44.898903 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 10:18:44 crc kubenswrapper[4735]: E1001 10:18:44.898911 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 10:18:44 crc kubenswrapper[4735]: E1001 10:18:44.898940 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 10:19:48.89892811 +0000 UTC m=+147.591749372 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 10:18:44 crc kubenswrapper[4735]: E1001 10:18:44.899039 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 10:18:44 crc kubenswrapper[4735]: E1001 10:18:44.899063 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 10:18:44 crc kubenswrapper[4735]: E1001 10:18:44.899082 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 10:18:44 crc kubenswrapper[4735]: E1001 10:18:44.899139 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 10:19:48.899120445 +0000 UTC m=+147.591941788 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 10:18:44 crc kubenswrapper[4735]: E1001 10:18:44.899147 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 10:18:44 crc kubenswrapper[4735]: E1001 10:18:44.899177 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 10:19:48.899168337 +0000 UTC m=+147.591989599 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.918662 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.918740 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.918757 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.918781 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:44 crc kubenswrapper[4735]: I1001 10:18:44.918799 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:44Z","lastTransitionTime":"2025-10-01T10:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
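The `object "…"/"…" not registered` failures above repeat for every secret, configmap, and projected volume until the kubelet's informer cache syncs, and each one pushes the retry out by over a minute (`durationBeforeRetry 1m4s`). To see which namespace/object pairs are actually blocking mounts, a small parser can group them. This is an illustrative sketch only: the regex is an assumption about the message format seen in these entries, not an official kubelet schema.

```python
import re
from collections import Counter

# Matches the kubelet's "not registered" volume errors, e.g.
#   object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
NOT_REGISTERED = re.compile(r'object "([^"]+)"/"([^"]+)" not registered')

def blocking_objects(log_text: str) -> Counter:
    """Count how often each (namespace, object) pair blocks a volume mount."""
    return Counter(m.groups() for m in NOT_REGISTERED.finditer(log_text))

# Two entries excerpted from the journal above (hypothetical sample variable).
sample = """\
E1001 10:18:44.898893 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
E1001 10:18:44.899147 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
"""

if __name__ == "__main__":
    for (ns, obj), n in blocking_objects(sample).most_common():
        print(f"{ns}/{obj}: {n}")
```

Feeding it the full journal (e.g. `journalctl -u kubelet` output) would typically show every pair resolving to the same two or three objects, which distinguishes a cache-sync delay from a genuinely missing ConfigMap or Secret.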
Has your network provider started?"} Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.022294 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.022342 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.022352 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.022370 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.022530 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:45Z","lastTransitionTime":"2025-10-01T10:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.126726 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.126807 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.126820 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.126839 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.126853 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:45Z","lastTransitionTime":"2025-10-01T10:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.229788 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.229847 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.229866 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.229890 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.229907 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:45Z","lastTransitionTime":"2025-10-01T10:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.332238 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.332304 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.332321 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.332346 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.332369 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:45Z","lastTransitionTime":"2025-10-01T10:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.435202 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.435258 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.435269 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.435285 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.435294 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:45Z","lastTransitionTime":"2025-10-01T10:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.539029 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.539085 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.539094 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.539110 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.539119 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:45Z","lastTransitionTime":"2025-10-01T10:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.641519 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.641555 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.641565 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.641584 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.641597 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:45Z","lastTransitionTime":"2025-10-01T10:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.743838 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.743877 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.743888 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.743907 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.743917 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:45Z","lastTransitionTime":"2025-10-01T10:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.846344 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.846414 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.846429 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.846446 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.846458 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:45Z","lastTransitionTime":"2025-10-01T10:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.949445 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.949572 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.949600 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.949632 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:45 crc kubenswrapper[4735]: I1001 10:18:45.949656 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:45Z","lastTransitionTime":"2025-10-01T10:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.053087 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.053125 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.053134 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.053165 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.053177 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:46Z","lastTransitionTime":"2025-10-01T10:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.156540 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.156584 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.156592 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.156607 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.156618 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:46Z","lastTransitionTime":"2025-10-01T10:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.260111 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.260186 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.260208 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.260236 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.260258 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:46Z","lastTransitionTime":"2025-10-01T10:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.363396 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.363472 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.363547 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.363574 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.363593 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:46Z","lastTransitionTime":"2025-10-01T10:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.465715 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.465766 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.465777 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.465797 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.465810 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:46Z","lastTransitionTime":"2025-10-01T10:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.568424 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.568485 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.568518 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.568532 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.568542 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:46Z","lastTransitionTime":"2025-10-01T10:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.671576 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.671620 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.671634 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.671652 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.671664 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:46Z","lastTransitionTime":"2025-10-01T10:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.773520 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.773572 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.773584 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.773601 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.773612 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:46Z","lastTransitionTime":"2025-10-01T10:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.876137 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.876177 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.876187 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.876201 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.876211 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:46Z","lastTransitionTime":"2025-10-01T10:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.896976 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.897002 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.897002 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.897021 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:18:46 crc kubenswrapper[4735]: E1001 10:18:46.897108 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:18:46 crc kubenswrapper[4735]: E1001 10:18:46.897218 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:18:46 crc kubenswrapper[4735]: E1001 10:18:46.897305 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:18:46 crc kubenswrapper[4735]: E1001 10:18:46.897386 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.977985 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.978017 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.978025 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.978039 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:46 crc kubenswrapper[4735]: I1001 10:18:46.978047 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:46Z","lastTransitionTime":"2025-10-01T10:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.081603 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.081672 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.081689 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.081713 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.081730 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:47Z","lastTransitionTime":"2025-10-01T10:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.184860 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.185267 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.185281 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.185300 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.185315 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:47Z","lastTransitionTime":"2025-10-01T10:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.287489 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.287582 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.287596 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.287611 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.287623 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:47Z","lastTransitionTime":"2025-10-01T10:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.389941 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.389981 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.389992 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.390007 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.390019 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:47Z","lastTransitionTime":"2025-10-01T10:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.492274 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.492314 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.492325 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.492340 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.492353 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:47Z","lastTransitionTime":"2025-10-01T10:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.594465 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.594549 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.594562 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.594579 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.594592 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:47Z","lastTransitionTime":"2025-10-01T10:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.696422 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.696457 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.696466 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.696479 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.696488 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:47Z","lastTransitionTime":"2025-10-01T10:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.798905 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.798950 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.798962 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.798982 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.798995 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:47Z","lastTransitionTime":"2025-10-01T10:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.902368 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.902413 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.902425 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.902440 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.902451 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:47Z","lastTransitionTime":"2025-10-01T10:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:47 crc kubenswrapper[4735]: I1001 10:18:47.911739 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.004219 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.004249 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.004257 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.004271 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.004281 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:48Z","lastTransitionTime":"2025-10-01T10:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.107640 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.107720 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.107737 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.107761 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.107778 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:48Z","lastTransitionTime":"2025-10-01T10:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.210175 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.210259 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.210273 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.210318 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.210333 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:48Z","lastTransitionTime":"2025-10-01T10:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.312993 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.313064 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.313080 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.313102 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.313118 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:48Z","lastTransitionTime":"2025-10-01T10:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.416561 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.416645 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.416682 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.416715 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.416735 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:48Z","lastTransitionTime":"2025-10-01T10:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.519881 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.519959 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.519993 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.520023 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.520044 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:48Z","lastTransitionTime":"2025-10-01T10:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.622330 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.622370 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.622380 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.622394 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.622414 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:48Z","lastTransitionTime":"2025-10-01T10:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.725403 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.725486 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.725548 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.725579 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.725601 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:48Z","lastTransitionTime":"2025-10-01T10:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.828135 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.828199 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.828218 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.828241 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.828257 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:48Z","lastTransitionTime":"2025-10-01T10:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.896201 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.896236 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.896249 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:18:48 crc kubenswrapper[4735]: E1001 10:18:48.896325 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.896344 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:18:48 crc kubenswrapper[4735]: E1001 10:18:48.896426 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:18:48 crc kubenswrapper[4735]: E1001 10:18:48.896577 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:18:48 crc kubenswrapper[4735]: E1001 10:18:48.896699 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.929910 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.929956 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.929968 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.929984 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:48 crc kubenswrapper[4735]: I1001 10:18:48.929995 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:48Z","lastTransitionTime":"2025-10-01T10:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.032264 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.032299 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.032308 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.032322 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.032330 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:49Z","lastTransitionTime":"2025-10-01T10:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.134561 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.134641 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.134674 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.134702 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.134723 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:49Z","lastTransitionTime":"2025-10-01T10:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.236914 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.236958 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.236969 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.236998 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.237009 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:49Z","lastTransitionTime":"2025-10-01T10:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.338802 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.338860 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.338879 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.338901 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.338919 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:49Z","lastTransitionTime":"2025-10-01T10:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.441895 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.441969 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.441992 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.442021 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.442044 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:49Z","lastTransitionTime":"2025-10-01T10:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.545183 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.545239 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.545254 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.545277 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.545291 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:49Z","lastTransitionTime":"2025-10-01T10:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.649069 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.649134 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.649151 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.649180 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.649198 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:49Z","lastTransitionTime":"2025-10-01T10:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.752900 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.752971 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.752992 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.753022 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.753042 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:49Z","lastTransitionTime":"2025-10-01T10:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.856672 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.856736 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.856754 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.856782 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.856802 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:49Z","lastTransitionTime":"2025-10-01T10:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.959759 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.959809 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.959821 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.959838 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:49 crc kubenswrapper[4735]: I1001 10:18:49.959849 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:49Z","lastTransitionTime":"2025-10-01T10:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.062989 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.063056 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.063071 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.063096 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.063114 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:50Z","lastTransitionTime":"2025-10-01T10:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.167538 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.167590 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.167600 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.167623 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.167639 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:50Z","lastTransitionTime":"2025-10-01T10:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.270833 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.270878 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.270889 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.270904 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.270915 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:50Z","lastTransitionTime":"2025-10-01T10:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.374171 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.374280 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.374302 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.374338 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.374359 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:50Z","lastTransitionTime":"2025-10-01T10:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.477809 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.477873 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.477888 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.477934 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.477960 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:50Z","lastTransitionTime":"2025-10-01T10:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.580636 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.580689 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.580705 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.580726 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.580741 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:50Z","lastTransitionTime":"2025-10-01T10:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.683653 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.683701 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.683714 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.683733 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.683744 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:50Z","lastTransitionTime":"2025-10-01T10:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.787439 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.787505 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.787516 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.787532 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.787542 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:50Z","lastTransitionTime":"2025-10-01T10:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.891107 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.891160 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.891168 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.891190 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.891202 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:50Z","lastTransitionTime":"2025-10-01T10:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.896301 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.896315 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.896354 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.896436 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:18:50 crc kubenswrapper[4735]: E1001 10:18:50.896551 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:18:50 crc kubenswrapper[4735]: E1001 10:18:50.896667 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:18:50 crc kubenswrapper[4735]: E1001 10:18:50.897107 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:18:50 crc kubenswrapper[4735]: E1001 10:18:50.897222 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.897712 4735 scope.go:117] "RemoveContainer" containerID="2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0" Oct 01 10:18:50 crc kubenswrapper[4735]: E1001 10:18:50.897968 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-k5mgz_openshift-ovn-kubernetes(32f1531f-5034-48d4-b694-efc774226e37)\"" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" podUID="32f1531f-5034-48d4-b694-efc774226e37" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.995419 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.995463 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.995471 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.995486 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:50 crc kubenswrapper[4735]: I1001 10:18:50.995513 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:50Z","lastTransitionTime":"2025-10-01T10:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.097352 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.097397 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.097406 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.097420 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.097432 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:51Z","lastTransitionTime":"2025-10-01T10:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.200616 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.200675 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.200685 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.200700 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.200710 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:51Z","lastTransitionTime":"2025-10-01T10:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.304283 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.304364 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.304388 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.304421 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.304442 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:51Z","lastTransitionTime":"2025-10-01T10:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.407640 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.407696 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.407705 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.407726 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.407738 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:51Z","lastTransitionTime":"2025-10-01T10:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.510532 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.510582 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.510599 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.510621 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.510636 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:51Z","lastTransitionTime":"2025-10-01T10:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.612559 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.612607 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.612622 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.612642 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.612659 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:51Z","lastTransitionTime":"2025-10-01T10:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.715900 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.715954 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.715968 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.715985 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.715998 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:51Z","lastTransitionTime":"2025-10-01T10:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.818245 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.818308 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.818538 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.818567 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.818585 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:51Z","lastTransitionTime":"2025-10-01T10:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.915456 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.922093 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.922150 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.922164 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.922183 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.922196 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:51Z","lastTransitionTime":"2025-10-01T10:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.938818 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.961997 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b444968cd2dd7c27e3ab5de010228bdb6aca03388c8bf326ff5dc5b65d59a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a3
2554baabf84253c7bb4d630c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.977926 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:51 crc kubenswrapper[4735]: I1001 10:18:51.994952 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:51Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.012358 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98eb54784a97eb54cbe3bff224f952baab0bcf996cf032f3a81d336b8aa60d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f224
7750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.023669 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qm6mr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77b56a8b-1a27-4727-b45e-43fbc3847ddd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sffkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sffkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qm6mr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:52 crc 
kubenswrapper[4735]: I1001 10:18:52.024743 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.024792 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.024807 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.024824 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.024834 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:52Z","lastTransitionTime":"2025-10-01T10:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.039243 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9b48b18fd3ad59b9decc42e44539f17f32fb5d6c811e3d2beda91071fc76fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbe000aa1f2ef04cd49dd91e7ec18f7f040623e9cf2f86354c8ae6fbab575080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zcvzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.063784 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc2a246-afd6-463c-a5f6-2099cc399b20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc67e1102794df69d3a244c5c84544734e638fa9961c9215aad48b3378a5b49c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609ae492f258f9461082bee85100a7cdb74c1f53a10677141fdd2e98e10c0fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da221f065e1113df91f511554df0ea4ec4615b1fa4a86c28416e59baf7abc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e82af6aa8cd80d0817a1ecefb6b240b36ed284906f9ad6bc1fe1a2c176d0a5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://271515a0b5291da42aca4d2d02fa7cb39bb09248693d575c21ac1607f67d8926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a8e0722008bed37a96ef54cbcd8129d5df9559b709b954910460a9426703a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a8e0722008bed37a96ef54cbcd8129d5df9559b709b954910460a9426703a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e141ae3d66c0fc869c1f603084e06bfa2d42b5ac6f23d84a349dd50ad4c424d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e141ae3d66c0fc869c1f603084e06bfa2d42b5ac6f23d84a349dd50ad4c424d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff1808c3b0f4dfc3f8ad74de5eb69d61e10a5892a375ed5d6ededec934f6dfae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff1808c3b0f4dfc3f8ad74de5eb69d61e10a5892a375ed5d6ededec934f6dfae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.076187 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:
23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73b8adee3754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.087095 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2be33635-a388-4c92-a1e6-9a0736dbbe79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07aa8a6d6d8611abb4a1531fc38c8bbdca73cf532d2f45645fcfb0eaeb92fec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e93060fbaf012ed4f8aec1624caf86aa986cea91a7ba97b7fb1dffa87343092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accb544db17576f62271088d4e35c414c0ed291197cf7d953466d7634edff0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb0cece7c30c3fbd18adeb8daa644b951b45a159308fa26eec931f95f0ac881\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7cb0cece7c30c3fbd18adeb8daa644b951b45a159308fa26eec931f95f0ac881\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.094746 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff408481
5fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.111511 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T10:18:33Z\\\",\\\"message\\\":\\\"001 10:18:33.600387 6760 ovnkube.go:599] Stopped ovnkube\\\\nI1001 10:18:33.600397 6760 services_controller.go:356] Processing sync for service openshift-operator-lifecycle-manager/catalog-operator-metrics for network=default\\\\nI1001 10:18:33.600387 6760 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch 
Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 10:18:33.600409 6760 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 10:18:33.600424 6760 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1001 10:18:33.600455 6760 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1001 10:18:33.600517 6760 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:18:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-k5mgz_openshift-ovn-kubernetes(32f1531f-5034-48d4-b694-efc774226e37)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c
026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.122986 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.126545 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.126577 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.126588 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 
10:18:52.126605 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.126616 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:52Z","lastTransitionTime":"2025-10-01T10:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.136220 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.150811 4735 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.165466 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc2735dc0ade0160be84ac2d38365c7691541f3aeda56026e5489a2a2b6a556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01
T10:18:30Z\\\",\\\"message\\\":\\\"2025-10-01T10:17:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4e070c81-c0cb-4c53-bad9-c116c63efe93\\\\n2025-10-01T10:17:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4e070c81-c0cb-4c53-bad9-c116c63efe93 to /host/opt/cni/bin/\\\\n2025-10-01T10:17:45Z [verbose] multus-daemon started\\\\n2025-10-01T10:17:45Z [verbose] Readiness Indicator file check\\\\n2025-10-01T10:18:30Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:18:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\
\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.175371 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b331af5f1d4f4b11c27d1c2285e760a27f9af30ff7fba24c2b73b63385f1e9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.230338 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.230388 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.230400 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.230418 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.230432 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:52Z","lastTransitionTime":"2025-10-01T10:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.332885 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.332929 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.332943 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.332960 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.332972 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:52Z","lastTransitionTime":"2025-10-01T10:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.435313 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.435346 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.435354 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.435367 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.435377 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:52Z","lastTransitionTime":"2025-10-01T10:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.538319 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.538371 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.538382 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.538396 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.538405 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:52Z","lastTransitionTime":"2025-10-01T10:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.640899 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.640939 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.640947 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.640960 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.640970 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:52Z","lastTransitionTime":"2025-10-01T10:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.743368 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.743421 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.743439 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.743483 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.743517 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:52Z","lastTransitionTime":"2025-10-01T10:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.826614 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.826654 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.826665 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.826680 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.826691 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:52Z","lastTransitionTime":"2025-10-01T10:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:52 crc kubenswrapper[4735]: E1001 10:18:52.844984 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.849327 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.849353 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.849364 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.849378 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.849387 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:52Z","lastTransitionTime":"2025-10-01T10:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:52 crc kubenswrapper[4735]: E1001 10:18:52.865038 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.868804 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.868845 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.868861 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.868886 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.868901 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:52Z","lastTransitionTime":"2025-10-01T10:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:52 crc kubenswrapper[4735]: E1001 10:18:52.881134 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.885109 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.885138 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.885148 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.885160 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.885171 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:52Z","lastTransitionTime":"2025-10-01T10:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.896263 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:18:52 crc kubenswrapper[4735]: E1001 10:18:52.896353 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:18:52 crc kubenswrapper[4735]: E1001 10:18:52.896308 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.896429 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.896545 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:18:52 crc kubenswrapper[4735]: E1001 10:18:52.896655 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.896687 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:18:52 crc kubenswrapper[4735]: E1001 10:18:52.896841 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:18:52 crc kubenswrapper[4735]: E1001 10:18:52.897028 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.901951 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.901981 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.901990 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.902003 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.902017 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:52Z","lastTransitionTime":"2025-10-01T10:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:52 crc kubenswrapper[4735]: E1001 10:18:52.914313 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2d0b3ffa-b2c4-4f1e-8072-c9d7f87a3270\\\",\\\"systemUUID\\\":\\\"f44ec9f4-8a46-42ac-97b9-caabc07abc52\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:18:52Z is after 2025-08-24T17:21:41Z" Oct 01 10:18:52 crc kubenswrapper[4735]: E1001 10:18:52.914584 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.918052 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.918128 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.918154 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.918175 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:52 crc kubenswrapper[4735]: I1001 10:18:52.918191 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:52Z","lastTransitionTime":"2025-10-01T10:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.020904 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.020941 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.020953 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.020972 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.020984 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:53Z","lastTransitionTime":"2025-10-01T10:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.123514 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.123564 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.123576 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.123594 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.123606 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:53Z","lastTransitionTime":"2025-10-01T10:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.227233 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.227326 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.227352 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.227393 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.227420 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:53Z","lastTransitionTime":"2025-10-01T10:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.330642 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.330687 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.330700 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.330725 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.330745 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:53Z","lastTransitionTime":"2025-10-01T10:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.434168 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.434237 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.434252 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.434274 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.434289 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:53Z","lastTransitionTime":"2025-10-01T10:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.538062 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.538144 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.538170 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.538200 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.538229 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:53Z","lastTransitionTime":"2025-10-01T10:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.641464 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.641559 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.641578 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.641605 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.641624 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:53Z","lastTransitionTime":"2025-10-01T10:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.745230 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.745284 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.745296 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.745314 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.745326 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:53Z","lastTransitionTime":"2025-10-01T10:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.848619 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.848699 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.848711 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.848730 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.848742 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:53Z","lastTransitionTime":"2025-10-01T10:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.911595 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.951562 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.951610 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.951621 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.951638 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:53 crc kubenswrapper[4735]: I1001 10:18:53.951650 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:53Z","lastTransitionTime":"2025-10-01T10:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.054086 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.054147 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.054160 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.054182 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.054195 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:54Z","lastTransitionTime":"2025-10-01T10:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.156988 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.157038 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.157049 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.157066 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.157077 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:54Z","lastTransitionTime":"2025-10-01T10:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.260108 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.260180 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.260196 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.260219 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.260234 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:54Z","lastTransitionTime":"2025-10-01T10:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.362394 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.362448 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.362458 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.362474 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.362485 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:54Z","lastTransitionTime":"2025-10-01T10:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.465602 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.465649 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.465658 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.465673 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.465683 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:54Z","lastTransitionTime":"2025-10-01T10:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.569104 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.569182 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.569207 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.569236 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.569252 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:54Z","lastTransitionTime":"2025-10-01T10:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.673620 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.673717 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.673745 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.673782 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.673808 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:54Z","lastTransitionTime":"2025-10-01T10:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.777110 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.777192 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.777217 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.777254 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.777281 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:54Z","lastTransitionTime":"2025-10-01T10:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.880345 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.880386 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.880398 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.880415 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.880427 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:54Z","lastTransitionTime":"2025-10-01T10:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.896253 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.896334 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.896266 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.896438 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:18:54 crc kubenswrapper[4735]: E1001 10:18:54.896396 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:18:54 crc kubenswrapper[4735]: E1001 10:18:54.896705 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:18:54 crc kubenswrapper[4735]: E1001 10:18:54.896794 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:18:54 crc kubenswrapper[4735]: E1001 10:18:54.896856 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.982870 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.982944 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.982965 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.982989 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:54 crc kubenswrapper[4735]: I1001 10:18:54.983006 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:54Z","lastTransitionTime":"2025-10-01T10:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.085203 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.085232 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.085241 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.085254 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.085263 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:55Z","lastTransitionTime":"2025-10-01T10:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.189990 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.190056 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.190077 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.190104 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.190122 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:55Z","lastTransitionTime":"2025-10-01T10:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.292914 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.292964 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.292976 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.292996 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.293008 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:55Z","lastTransitionTime":"2025-10-01T10:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.395264 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.395301 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.395314 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.395328 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.395338 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:55Z","lastTransitionTime":"2025-10-01T10:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.497620 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.497722 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.497744 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.497772 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.497790 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:55Z","lastTransitionTime":"2025-10-01T10:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.600910 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.601034 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.601057 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.601085 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.601155 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:55Z","lastTransitionTime":"2025-10-01T10:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.702930 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.702975 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.702989 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.703005 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.703016 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:55Z","lastTransitionTime":"2025-10-01T10:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.806570 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.806666 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.806695 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.806718 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.806733 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:55Z","lastTransitionTime":"2025-10-01T10:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.909719 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.909749 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.909757 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.909769 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:55 crc kubenswrapper[4735]: I1001 10:18:55.909779 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:55Z","lastTransitionTime":"2025-10-01T10:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.012264 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.012300 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.012311 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.012330 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.012344 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:56Z","lastTransitionTime":"2025-10-01T10:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.115005 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.115049 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.115058 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.115070 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.115079 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:56Z","lastTransitionTime":"2025-10-01T10:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.217368 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.217630 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.217643 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.217659 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.217670 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:56Z","lastTransitionTime":"2025-10-01T10:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.319708 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.319749 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.319760 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.319776 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.319788 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:56Z","lastTransitionTime":"2025-10-01T10:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.422233 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.422288 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.422301 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.422320 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.422335 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:56Z","lastTransitionTime":"2025-10-01T10:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.524531 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.524571 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.524582 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.524597 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.524608 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:56Z","lastTransitionTime":"2025-10-01T10:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.627485 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.627557 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.627568 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.627583 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.627594 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:56Z","lastTransitionTime":"2025-10-01T10:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.730274 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.730305 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.730313 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.730324 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.730332 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:56Z","lastTransitionTime":"2025-10-01T10:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.832726 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.832809 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.832821 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.832840 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.832853 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:56Z","lastTransitionTime":"2025-10-01T10:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.895939 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.896012 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.896032 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.895965 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:18:56 crc kubenswrapper[4735]: E1001 10:18:56.896116 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:18:56 crc kubenswrapper[4735]: E1001 10:18:56.896254 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:18:56 crc kubenswrapper[4735]: E1001 10:18:56.896352 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:18:56 crc kubenswrapper[4735]: E1001 10:18:56.896441 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.935355 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.935439 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.935468 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.935552 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:56 crc kubenswrapper[4735]: I1001 10:18:56.935575 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:56Z","lastTransitionTime":"2025-10-01T10:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.038544 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.038601 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.038617 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.038644 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.038673 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:57Z","lastTransitionTime":"2025-10-01T10:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.141536 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.141593 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.141603 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.141619 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.141630 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:57Z","lastTransitionTime":"2025-10-01T10:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.244977 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.245027 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.245038 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.245054 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.245064 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:57Z","lastTransitionTime":"2025-10-01T10:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.348060 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.348091 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.348122 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.348137 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.348147 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:57Z","lastTransitionTime":"2025-10-01T10:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.450857 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.451158 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.451282 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.451387 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.451520 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:57Z","lastTransitionTime":"2025-10-01T10:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.553938 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.554001 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.554016 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.554042 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.554065 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:57Z","lastTransitionTime":"2025-10-01T10:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.656015 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.656057 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.656068 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.656083 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.656094 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:57Z","lastTransitionTime":"2025-10-01T10:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.758407 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.758759 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.758769 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.758783 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.758796 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:57Z","lastTransitionTime":"2025-10-01T10:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.860934 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.860967 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.860977 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.860992 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.861002 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:57Z","lastTransitionTime":"2025-10-01T10:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.963054 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.963639 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.963667 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.963683 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:57 crc kubenswrapper[4735]: I1001 10:18:57.963695 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:57Z","lastTransitionTime":"2025-10-01T10:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.066071 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.066106 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.066115 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.066130 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.066143 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:58Z","lastTransitionTime":"2025-10-01T10:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.168389 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.168431 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.168441 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.168466 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.168477 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:58Z","lastTransitionTime":"2025-10-01T10:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.270483 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.270537 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.270549 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.270564 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.270576 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:58Z","lastTransitionTime":"2025-10-01T10:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.372867 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.372906 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.372914 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.372927 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.372937 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:58Z","lastTransitionTime":"2025-10-01T10:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.475642 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.475678 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.475687 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.475702 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.475711 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:58Z","lastTransitionTime":"2025-10-01T10:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.577952 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.577996 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.578008 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.578023 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.578035 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:58Z","lastTransitionTime":"2025-10-01T10:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.680082 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.680173 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.680193 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.680213 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.680224 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:58Z","lastTransitionTime":"2025-10-01T10:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.782913 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.782975 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.782987 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.783008 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.783023 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:58Z","lastTransitionTime":"2025-10-01T10:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.885537 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.885573 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.885582 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.885595 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.885604 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:58Z","lastTransitionTime":"2025-10-01T10:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.896668 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.896756 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.896849 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.896996 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:18:58 crc kubenswrapper[4735]: E1001 10:18:58.897005 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:18:58 crc kubenswrapper[4735]: E1001 10:18:58.897065 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:18:58 crc kubenswrapper[4735]: E1001 10:18:58.897243 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:18:58 crc kubenswrapper[4735]: E1001 10:18:58.897349 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.989301 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.989353 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.989364 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.989381 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:58 crc kubenswrapper[4735]: I1001 10:18:58.989398 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:58Z","lastTransitionTime":"2025-10-01T10:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.093134 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.093186 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.093198 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.093220 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.093239 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:59Z","lastTransitionTime":"2025-10-01T10:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.196829 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.196871 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.196880 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.196912 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.196925 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:59Z","lastTransitionTime":"2025-10-01T10:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.238539 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77b56a8b-1a27-4727-b45e-43fbc3847ddd-metrics-certs\") pod \"network-metrics-daemon-qm6mr\" (UID: \"77b56a8b-1a27-4727-b45e-43fbc3847ddd\") " pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:18:59 crc kubenswrapper[4735]: E1001 10:18:59.238802 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 10:18:59 crc kubenswrapper[4735]: E1001 10:18:59.238921 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77b56a8b-1a27-4727-b45e-43fbc3847ddd-metrics-certs podName:77b56a8b-1a27-4727-b45e-43fbc3847ddd nodeName:}" failed. No retries permitted until 2025-10-01 10:20:03.238889312 +0000 UTC m=+161.931710614 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/77b56a8b-1a27-4727-b45e-43fbc3847ddd-metrics-certs") pod "network-metrics-daemon-qm6mr" (UID: "77b56a8b-1a27-4727-b45e-43fbc3847ddd") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.300123 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.300206 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.300225 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.300253 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.300275 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:59Z","lastTransitionTime":"2025-10-01T10:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.403754 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.403807 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.403816 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.403835 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.403846 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:59Z","lastTransitionTime":"2025-10-01T10:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.506920 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.506990 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.507003 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.507031 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.507051 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:59Z","lastTransitionTime":"2025-10-01T10:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.609396 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.609430 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.609439 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.609453 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.609462 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:59Z","lastTransitionTime":"2025-10-01T10:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.712301 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.712342 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.712354 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.712371 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.712385 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:59Z","lastTransitionTime":"2025-10-01T10:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.815039 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.815073 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.815085 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.815099 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.815107 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:59Z","lastTransitionTime":"2025-10-01T10:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.917474 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.917555 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.917567 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.917579 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:18:59 crc kubenswrapper[4735]: I1001 10:18:59.917591 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:18:59Z","lastTransitionTime":"2025-10-01T10:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.020102 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.020141 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.020149 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.020163 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.020173 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:19:00Z","lastTransitionTime":"2025-10-01T10:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.122669 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.122709 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.122723 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.122738 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.122748 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:19:00Z","lastTransitionTime":"2025-10-01T10:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.224549 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.224594 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.224604 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.224617 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.224626 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:19:00Z","lastTransitionTime":"2025-10-01T10:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.326785 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.326823 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.326862 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.326879 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.326888 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:19:00Z","lastTransitionTime":"2025-10-01T10:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.428619 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.428656 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.428682 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.428696 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.428705 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:19:00Z","lastTransitionTime":"2025-10-01T10:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.530951 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.530980 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.530988 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.531002 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.531012 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:19:00Z","lastTransitionTime":"2025-10-01T10:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.633712 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.633772 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.633783 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.633801 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.633833 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:19:00Z","lastTransitionTime":"2025-10-01T10:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.737470 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.737561 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.737575 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.737598 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.737617 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:19:00Z","lastTransitionTime":"2025-10-01T10:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.840728 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.840768 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.840776 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.840791 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.840801 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:19:00Z","lastTransitionTime":"2025-10-01T10:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.895937 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.896030 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:19:00 crc kubenswrapper[4735]: E1001 10:19:00.896280 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.896427 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:19:00 crc kubenswrapper[4735]: E1001 10:19:00.896556 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.896427 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:19:00 crc kubenswrapper[4735]: E1001 10:19:00.896735 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:19:00 crc kubenswrapper[4735]: E1001 10:19:00.896789 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.944599 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.944655 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.944666 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.944691 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:19:00 crc kubenswrapper[4735]: I1001 10:19:00.944703 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:19:00Z","lastTransitionTime":"2025-10-01T10:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.048722 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.048777 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.048790 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.048807 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.048817 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:19:01Z","lastTransitionTime":"2025-10-01T10:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.151907 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.151951 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.151960 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.151975 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.151989 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:19:01Z","lastTransitionTime":"2025-10-01T10:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.254226 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.254298 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.254320 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.254343 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.254356 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:19:01Z","lastTransitionTime":"2025-10-01T10:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.356702 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.356754 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.356764 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.356778 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.356789 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:19:01Z","lastTransitionTime":"2025-10-01T10:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.460183 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.460229 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.460238 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.460252 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.460262 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:19:01Z","lastTransitionTime":"2025-10-01T10:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.563380 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.563432 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.563443 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.563459 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.563468 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:19:01Z","lastTransitionTime":"2025-10-01T10:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.666632 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.666675 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.666688 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.666773 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.666793 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:19:01Z","lastTransitionTime":"2025-10-01T10:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.769202 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.769242 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.769251 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.769264 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.769274 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:19:01Z","lastTransitionTime":"2025-10-01T10:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.872015 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.872108 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.872125 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.872150 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.872166 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:19:01Z","lastTransitionTime":"2025-10-01T10:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.915962 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ad8557c7f8854ba00bcc802e1a0d427b9dc7dfa936c02f56bb70e5c991d9039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:19:01Z is after 2025-08-24T17:21:41Z" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.933651 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:19:01Z is after 2025-08-24T17:21:41Z" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.956096 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63cf5011-57e9-44fa-a662-f30391ef1ff8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98eb54784a97eb54cbe3bff224f952baab0bcf996cf032f3a81d336b8aa60d51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://440a656da35c46cc9908c42c9345bf186f6f4bf8b1049072ae17d44a90ab84f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de7e6dc67949ed69e6593637be6b0c8936520613e790bb58b5b81ad5da658ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33c04c198ca615b8438ab935ac01a70e05c559cb4d67e01c6bcc0cdbea1e70f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f224
7750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f2247750a2b94ee52ff6b3af9ca9611fd7b8132f6400ad6ba5be0aef5e8a52c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://364d6ffc5a86119509e33d1aefd0aa1398c5991787772cb6f653819d6619a481\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:49Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e781871a002348515bfabc3663ae660e5b0b7510214281a49eac2847b8c7eab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntcth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6qlsd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:19:01Z is after 2025-08-24T17:21:41Z" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.970325 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qm6mr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77b56a8b-1a27-4727-b45e-43fbc3847ddd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sffkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sffkg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qm6mr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:19:01Z is after 2025-08-24T17:21:41Z" Oct 01 10:19:01 crc 
kubenswrapper[4735]: I1001 10:19:01.975360 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.975392 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.975403 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.975419 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.975430 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:19:01Z","lastTransitionTime":"2025-10-01T10:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:19:01 crc kubenswrapper[4735]: I1001 10:19:01.984924 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cee65f2-f6a2-4fc7-bb2c-3268d7665f7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a321a308d85e66b54cf5a08968c8fa2b856b42b021af3ab8cdccbb8c7ac03652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6bb5b2bdb9bd05c3d9a137553994d6169c1a24b1719c3a66dcef2b45678a78b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6bb5b2bdb9bd05c3d9a137553994d6169c1a24b1719c3a66dcef2b45678a78b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:19:01Z is after 2025-08-24T17:21:41Z" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.009716 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fc2a246-afd6-463c-a5f6-2099cc399b20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc67e1102794df69d3a244c5c84544734e638fa9961c9215aad48b3378a5b49c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609ae492f258f9461082bee85100a7cdb74c1f53a10677141fdd2e98e10c0fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da221f065e1113df91f511554df0ea4ec4615b1fa4a86c28416e59baf7abc85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e82af6aa8cd80d0817a1ecefb6b240b36ed284906f9ad6bc1fe1a2c176d0a5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://271515a0b5291da42aca4d2d02fa7cb39bb09248693d575c21ac1607f67d8926\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a8e0722008bed37a96ef54cbcd8129d5df9559b709b954910460a9426703a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a8e0722008bed37a96ef54cbcd8129d5df9559b709b954910460a9426703a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e141ae3d66c0fc869c1f603084e06bfa2d42b5ac6f23d84a349dd50ad4c424d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e141ae3d66c0fc869c1f603084e06bfa2d42b5ac6f23d84a349dd50ad4c424d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ff1808c3b0f4dfc3f8ad74de5eb69d61e10a5892a375ed5d6ededec934f6dfae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff1808c3b0f4dfc3f8ad74de5eb69d61e10a5892a375ed5d6ededec934f6dfae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:19:02Z is after 2025-08-24T17:21:41Z" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.024633 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f3e95acd-3b2e-4470-b2eb-cc0b1ae8d0a3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bb498640a806273bb1562957c1c070309079e934984da9fd0e2b29df54de35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5bc4c03105041c42aab1f2544ebc68e4c8f276c34229964b68bf3ed63b2edbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc06a802fc239c939ebf50add628e0741428cd5874ff858ae192cc9d782e5e7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:
23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a049e79794f886b6622d898171e4e83753352f445e8c600665dc73b8adee3754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:19:02Z is after 2025-08-24T17:21:41Z" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.044179 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2be33635-a388-4c92-a1e6-9a0736dbbe79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07aa8a6d6d8611abb4a1531fc38c8bbdca73cf532d2f45645fcfb0eaeb92fec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e93060fbaf012ed4f8aec1624caf86aa986cea91a7ba97b7fb1dffa87343092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://accb544db17576f62271088d4e35c414c0ed291197cf7d953466d7634edff0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cb0cece7c30c3fbd18adeb8daa644b951b45a159308fa26eec931f95f0ac881\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7cb0cece7c30c3fbd18adeb8daa644b951b45a159308fa26eec931f95f0ac881\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:19:02Z is after 2025-08-24T17:21:41Z" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.055764 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5n9cx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17f1afd4-6acd-4c1c-adb3-e60c2ade4aa9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff408481
5fe5b8ee63b464f56d3a7bda3b5cd358b8b9f8e361e723ab2a3f4daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jpxch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:40Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5n9cx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:19:02Z is after 2025-08-24T17:21:41Z" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.074347 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32f1531f-5034-48d4-b694-efc774226e37\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T10:18:33Z\\\",\\\"message\\\":\\\"001 10:18:33.600387 6760 ovnkube.go:599] Stopped ovnkube\\\\nI1001 10:18:33.600397 6760 services_controller.go:356] Processing sync for service openshift-operator-lifecycle-manager/catalog-operator-metrics for network=default\\\\nI1001 10:18:33.600387 6760 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch 
Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 10:18:33.600409 6760 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 10:18:33.600424 6760 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1001 10:18:33.600455 6760 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1001 10:18:33.600517 6760 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:18:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-k5mgz_openshift-ovn-kubernetes(32f1531f-5034-48d4-b694-efc774226e37)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4fe4820f7e603685c
026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqpq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-k5mgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:19:02Z is after 2025-08-24T17:21:41Z" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.077615 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.077643 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.077651 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.077666 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.077675 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:19:02Z","lastTransitionTime":"2025-10-01T10:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.090422 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"358434f3-ecfd-4a3e-85d4-0a5d3d12d4d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9b48b18fd3ad59b9decc42e44539f17f32fb5d6c811e3d2beda91071fc76fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbe000aa1f2ef04cd49dd91e7ec18f7f040623e9cf2f86354c8ae6fbab575080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2nqt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:54Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zcvzj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:19:02Z is after 2025-08-24T17:21:41Z" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.104149 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:19:02Z is after 2025-08-24T17:21:41Z" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.115547 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99b7f57dd32b2c79d3035f89c98ea9e1370ebceedd973ca53516ade0cf23937c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1a21666ef1eef1960e0d68966f5093b9cd8ee131cd68e535e6d91f22ae5edf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:19:02Z is after 2025-08-24T17:21:41Z" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.127970 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d9eca4874ced0d5dfbbda626d823a61f79992ea877b085814b2d2c9104ad3a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T10:19:02Z is after 2025-08-24T17:21:41Z" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.141852 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dz9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:18:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dc2735dc0ade0160be84ac2d38365c7691541f3aeda56026e5489a2a2b6a556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T10:18:30Z\\\",\\\"message\\\":\\\"2025-10-01T10:17:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_4e070c81-c0cb-4c53-bad9-c116c63efe93\\\\n2025-10-01T10:17:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4e070c81-c0cb-4c53-bad9-c116c63efe93 to /host/opt/cni/bin/\\\\n2025-10-01T10:17:45Z [verbose] multus-daemon started\\\\n2025-10-01T10:17:45Z [verbose] Readiness Indicator file check\\\\n2025-10-01T10:18:30Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:18:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6hlq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dz9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:19:02Z is after 2025-08-24T17:21:41Z" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.152025 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zdmp4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14eb0416-c94e-4b5c-824e-720abd2fe3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b331af5f1d4f4b11c27d1c2285e760a27f9af30ff7fba24c2b73b63385f1e9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prqcf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:44Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zdmp4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:19:02Z is after 2025-08-24T17:21:41Z" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.165193 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"57c7ac5a-4f06-4bc8-a5a5-2847a333a6a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646c0cb0fe81ca4e3eda837dc9ec9120602d4913e9a9b0e1a7b147eda927b4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1463174f3c2e0b23753621677c04f87b3e1ace5bd1e9db2430d0f59941a89d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be5c8a6f5edfaf1d7ed78df4e12bd18461ddc9be5bc8d2f4a1392a773bc49cba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c18d553510ce4b94c304b9ab7c8c83d08a80852caad26565d389b929307e957\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7310fd0dd82ef03796ffbb1d48ed87d0b160177614280aa5f7d7a214c89ddb15\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 10:17:35.450660 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 10:17:35.453103 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3276380854/tls.crt::/tmp/serving-cert-3276380854/tls.key\\\\\\\"\\\\nI1001 10:17:40.914170 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 10:17:40.917410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 10:17:40.917435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 10:17:40.917458 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 10:17:40.917465 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 10:17:40.931938 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 10:17:40.931968 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931974 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 10:17:40.931981 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 10:17:40.931987 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 10:17:40.931993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 10:17:40.931996 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 10:17:40.932000 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 10:17:40.932933 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7e21eaa267ba91273db23b50f35954ecbbfdb07155372169b5799b1f8a872b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T1
0:17:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6807d00a2d35095566ad8b6f2b895c7f15f9441af2d7035b4107d756110240\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T10:17:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T10:17:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:19:02Z is after 2025-08-24T17:21:41Z" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.178761 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:19:02Z is after 2025-08-24T17:21:41Z" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.179548 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.179599 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.179610 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.179631 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.179646 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:19:02Z","lastTransitionTime":"2025-10-01T10:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.191918 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c2fdbf0-2469-4ca0-8624-d63609123cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T10:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b444968cd2dd7c27e3ab5de010228bdb6aca03388c8bf326ff5dc5b65d59a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a32554baabf84253c7bb4d630c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T10:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr2m9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T10:17:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xgg24\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T10:19:02Z is after 2025-08-24T17:21:41Z" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.281941 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:19:02 crc 
kubenswrapper[4735]: I1001 10:19:02.281972 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.281980 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.281996 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.282006 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:19:02Z","lastTransitionTime":"2025-10-01T10:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.384188 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.384228 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.384237 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.384252 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.384264 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:19:02Z","lastTransitionTime":"2025-10-01T10:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.486178 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.486210 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.486218 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.486231 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.486241 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:19:02Z","lastTransitionTime":"2025-10-01T10:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.589559 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.589611 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.589629 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.589655 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.589672 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:19:02Z","lastTransitionTime":"2025-10-01T10:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.691888 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.691936 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.691967 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.691979 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.691987 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:19:02Z","lastTransitionTime":"2025-10-01T10:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.794304 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.794342 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.794350 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.794363 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.794373 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:19:02Z","lastTransitionTime":"2025-10-01T10:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.895994 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.896037 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.896079 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.896006 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:19:02 crc kubenswrapper[4735]: E1001 10:19:02.896219 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:19:02 crc kubenswrapper[4735]: E1001 10:19:02.896320 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:19:02 crc kubenswrapper[4735]: E1001 10:19:02.896445 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:19:02 crc kubenswrapper[4735]: E1001 10:19:02.896704 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.897465 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.897526 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.897540 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.897557 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:19:02 crc kubenswrapper[4735]: I1001 10:19:02.897569 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:19:02Z","lastTransitionTime":"2025-10-01T10:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.001070 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.001123 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.001136 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.001156 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.001168 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:19:03Z","lastTransitionTime":"2025-10-01T10:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.103849 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.103907 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.103916 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.103931 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.103940 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:19:03Z","lastTransitionTime":"2025-10-01T10:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.174928 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.174989 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.175000 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.175015 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.175027 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T10:19:03Z","lastTransitionTime":"2025-10-01T10:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.276064 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-lcqkv"] Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.276413 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lcqkv" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.278671 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.278806 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.278969 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.279802 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.315951 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6qlsd" podStartSLOduration=82.315928863 podStartE2EDuration="1m22.315928863s" podCreationTimestamp="2025-10-01 10:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:03.30496689 +0000 UTC m=+101.997788152" watchObservedRunningTime="2025-10-01 10:19:03.315928863 +0000 UTC m=+102.008750125" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.377242 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=79.377218885 podStartE2EDuration="1m19.377218885s" podCreationTimestamp="2025-10-01 10:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:03.377067791 +0000 UTC m=+102.069889053" watchObservedRunningTime="2025-10-01 
10:19:03.377218885 +0000 UTC m=+102.070040147" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.377563 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=16.377558004 podStartE2EDuration="16.377558004s" podCreationTimestamp="2025-10-01 10:18:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:03.361150825 +0000 UTC m=+102.053972087" watchObservedRunningTime="2025-10-01 10:19:03.377558004 +0000 UTC m=+102.070379256" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.384548 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/23bfb529-04c3-43ed-ba37-0d1e292a461d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lcqkv\" (UID: \"23bfb529-04c3-43ed-ba37-0d1e292a461d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lcqkv" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.384605 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/23bfb529-04c3-43ed-ba37-0d1e292a461d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lcqkv\" (UID: \"23bfb529-04c3-43ed-ba37-0d1e292a461d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lcqkv" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.384629 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/23bfb529-04c3-43ed-ba37-0d1e292a461d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lcqkv\" (UID: \"23bfb529-04c3-43ed-ba37-0d1e292a461d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lcqkv" Oct 01 10:19:03 crc 
kubenswrapper[4735]: I1001 10:19:03.384720 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/23bfb529-04c3-43ed-ba37-0d1e292a461d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lcqkv\" (UID: \"23bfb529-04c3-43ed-ba37-0d1e292a461d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lcqkv" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.384795 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23bfb529-04c3-43ed-ba37-0d1e292a461d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lcqkv\" (UID: \"23bfb529-04c3-43ed-ba37-0d1e292a461d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lcqkv" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.388845 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=48.388834316 podStartE2EDuration="48.388834316s" podCreationTimestamp="2025-10-01 10:18:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:03.388564138 +0000 UTC m=+102.081385400" watchObservedRunningTime="2025-10-01 10:19:03.388834316 +0000 UTC m=+102.081655578" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.419309 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5n9cx" podStartSLOduration=83.419289341 podStartE2EDuration="1m23.419289341s" podCreationTimestamp="2025-10-01 10:17:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:03.39831406 +0000 UTC m=+102.091135322" watchObservedRunningTime="2025-10-01 10:19:03.419289341 +0000 UTC 
m=+102.112110603" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.433717 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zcvzj" podStartSLOduration=81.433699218 podStartE2EDuration="1m21.433699218s" podCreationTimestamp="2025-10-01 10:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:03.433668237 +0000 UTC m=+102.126489519" watchObservedRunningTime="2025-10-01 10:19:03.433699218 +0000 UTC m=+102.126520480" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.442250 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=10.442231327 podStartE2EDuration="10.442231327s" podCreationTimestamp="2025-10-01 10:18:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:03.442065122 +0000 UTC m=+102.134886384" watchObservedRunningTime="2025-10-01 10:19:03.442231327 +0000 UTC m=+102.135052589" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.485972 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/23bfb529-04c3-43ed-ba37-0d1e292a461d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lcqkv\" (UID: \"23bfb529-04c3-43ed-ba37-0d1e292a461d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lcqkv" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.486040 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/23bfb529-04c3-43ed-ba37-0d1e292a461d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lcqkv\" (UID: \"23bfb529-04c3-43ed-ba37-0d1e292a461d\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lcqkv" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.486081 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23bfb529-04c3-43ed-ba37-0d1e292a461d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lcqkv\" (UID: \"23bfb529-04c3-43ed-ba37-0d1e292a461d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lcqkv" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.486110 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/23bfb529-04c3-43ed-ba37-0d1e292a461d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lcqkv\" (UID: \"23bfb529-04c3-43ed-ba37-0d1e292a461d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lcqkv" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.486118 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/23bfb529-04c3-43ed-ba37-0d1e292a461d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lcqkv\" (UID: \"23bfb529-04c3-43ed-ba37-0d1e292a461d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lcqkv" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.486159 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/23bfb529-04c3-43ed-ba37-0d1e292a461d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lcqkv\" (UID: \"23bfb529-04c3-43ed-ba37-0d1e292a461d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lcqkv" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.486453 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/23bfb529-04c3-43ed-ba37-0d1e292a461d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lcqkv\" (UID: \"23bfb529-04c3-43ed-ba37-0d1e292a461d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lcqkv" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.487235 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/23bfb529-04c3-43ed-ba37-0d1e292a461d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lcqkv\" (UID: \"23bfb529-04c3-43ed-ba37-0d1e292a461d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lcqkv" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.488321 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8dz9b" podStartSLOduration=82.488307991 podStartE2EDuration="1m22.488307991s" podCreationTimestamp="2025-10-01 10:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:03.478120547 +0000 UTC m=+102.170941819" watchObservedRunningTime="2025-10-01 10:19:03.488307991 +0000 UTC m=+102.181129253" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.493020 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23bfb529-04c3-43ed-ba37-0d1e292a461d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lcqkv\" (UID: \"23bfb529-04c3-43ed-ba37-0d1e292a461d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lcqkv" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.500672 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-zdmp4" podStartSLOduration=82.500653881 podStartE2EDuration="1m22.500653881s" podCreationTimestamp="2025-10-01 10:17:41 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:03.488027393 +0000 UTC m=+102.180848655" watchObservedRunningTime="2025-10-01 10:19:03.500653881 +0000 UTC m=+102.193475143" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.506898 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/23bfb529-04c3-43ed-ba37-0d1e292a461d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lcqkv\" (UID: \"23bfb529-04c3-43ed-ba37-0d1e292a461d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lcqkv" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.527039 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podStartSLOduration=82.527015487 podStartE2EDuration="1m22.527015487s" podCreationTimestamp="2025-10-01 10:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:03.526936895 +0000 UTC m=+102.219758157" watchObservedRunningTime="2025-10-01 10:19:03.527015487 +0000 UTC m=+102.219836749" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.545663 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=82.545646607 podStartE2EDuration="1m22.545646607s" podCreationTimestamp="2025-10-01 10:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:03.544960928 +0000 UTC m=+102.237782200" watchObservedRunningTime="2025-10-01 10:19:03.545646607 +0000 UTC m=+102.238467859" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.590432 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lcqkv" Oct 01 10:19:03 crc kubenswrapper[4735]: I1001 10:19:03.897865 4735 scope.go:117] "RemoveContainer" containerID="2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0" Oct 01 10:19:03 crc kubenswrapper[4735]: E1001 10:19:03.898083 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-k5mgz_openshift-ovn-kubernetes(32f1531f-5034-48d4-b694-efc774226e37)\"" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" podUID="32f1531f-5034-48d4-b694-efc774226e37" Oct 01 10:19:04 crc kubenswrapper[4735]: I1001 10:19:04.372266 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lcqkv" event={"ID":"23bfb529-04c3-43ed-ba37-0d1e292a461d","Type":"ContainerStarted","Data":"2f60f79c2ed1843caac5a15c3d0c5f42703e774c82b52436eb11aa73994f64c1"} Oct 01 10:19:04 crc kubenswrapper[4735]: I1001 10:19:04.372327 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lcqkv" event={"ID":"23bfb529-04c3-43ed-ba37-0d1e292a461d","Type":"ContainerStarted","Data":"e84a7fa714993b6db7012cf4fab2f0a64b66a0de1caa5c5dae6802ce46dc184c"} Oct 01 10:19:04 crc kubenswrapper[4735]: I1001 10:19:04.895902 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:19:04 crc kubenswrapper[4735]: I1001 10:19:04.895914 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:19:04 crc kubenswrapper[4735]: I1001 10:19:04.895916 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:19:04 crc kubenswrapper[4735]: I1001 10:19:04.896207 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:19:04 crc kubenswrapper[4735]: E1001 10:19:04.896308 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:19:04 crc kubenswrapper[4735]: E1001 10:19:04.896399 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:19:04 crc kubenswrapper[4735]: E1001 10:19:04.896592 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:19:04 crc kubenswrapper[4735]: E1001 10:19:04.896672 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:19:06 crc kubenswrapper[4735]: I1001 10:19:06.896705 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:19:06 crc kubenswrapper[4735]: I1001 10:19:06.896748 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:19:06 crc kubenswrapper[4735]: E1001 10:19:06.896917 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:19:06 crc kubenswrapper[4735]: I1001 10:19:06.896950 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:19:06 crc kubenswrapper[4735]: E1001 10:19:06.897307 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:19:06 crc kubenswrapper[4735]: E1001 10:19:06.897150 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:19:06 crc kubenswrapper[4735]: I1001 10:19:06.897025 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:19:06 crc kubenswrapper[4735]: E1001 10:19:06.897561 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:19:08 crc kubenswrapper[4735]: I1001 10:19:08.896456 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:19:08 crc kubenswrapper[4735]: I1001 10:19:08.896456 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:19:08 crc kubenswrapper[4735]: I1001 10:19:08.896529 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:19:08 crc kubenswrapper[4735]: I1001 10:19:08.896534 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:19:08 crc kubenswrapper[4735]: E1001 10:19:08.896623 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:19:08 crc kubenswrapper[4735]: E1001 10:19:08.896668 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:19:08 crc kubenswrapper[4735]: E1001 10:19:08.896738 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:19:08 crc kubenswrapper[4735]: E1001 10:19:08.896776 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:19:10 crc kubenswrapper[4735]: I1001 10:19:10.896699 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:19:10 crc kubenswrapper[4735]: I1001 10:19:10.896813 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:19:10 crc kubenswrapper[4735]: I1001 10:19:10.896740 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:19:10 crc kubenswrapper[4735]: I1001 10:19:10.896699 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:19:10 crc kubenswrapper[4735]: E1001 10:19:10.896903 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:19:10 crc kubenswrapper[4735]: E1001 10:19:10.896986 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:19:10 crc kubenswrapper[4735]: E1001 10:19:10.897089 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:19:10 crc kubenswrapper[4735]: E1001 10:19:10.897152 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:19:12 crc kubenswrapper[4735]: I1001 10:19:12.896485 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:19:12 crc kubenswrapper[4735]: I1001 10:19:12.896506 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:19:12 crc kubenswrapper[4735]: I1001 10:19:12.896524 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:19:12 crc kubenswrapper[4735]: I1001 10:19:12.896588 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:19:12 crc kubenswrapper[4735]: E1001 10:19:12.896677 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:19:12 crc kubenswrapper[4735]: E1001 10:19:12.896777 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:19:12 crc kubenswrapper[4735]: E1001 10:19:12.896829 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:19:12 crc kubenswrapper[4735]: E1001 10:19:12.896889 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:19:14 crc kubenswrapper[4735]: I1001 10:19:14.896859 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:19:14 crc kubenswrapper[4735]: I1001 10:19:14.896879 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:19:14 crc kubenswrapper[4735]: I1001 10:19:14.896943 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:19:14 crc kubenswrapper[4735]: I1001 10:19:14.897053 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:19:14 crc kubenswrapper[4735]: E1001 10:19:14.897189 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:19:14 crc kubenswrapper[4735]: E1001 10:19:14.897808 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:19:14 crc kubenswrapper[4735]: E1001 10:19:14.898223 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:19:14 crc kubenswrapper[4735]: E1001 10:19:14.898396 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:19:14 crc kubenswrapper[4735]: I1001 10:19:14.898598 4735 scope.go:117] "RemoveContainer" containerID="2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0" Oct 01 10:19:15 crc kubenswrapper[4735]: I1001 10:19:15.402921 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k5mgz_32f1531f-5034-48d4-b694-efc774226e37/ovnkube-controller/3.log" Oct 01 10:19:15 crc kubenswrapper[4735]: I1001 10:19:15.406119 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" event={"ID":"32f1531f-5034-48d4-b694-efc774226e37","Type":"ContainerStarted","Data":"83ba1e5271b7461df27551275fcd38bfd0342e811f0565548c17fd46d0c947b2"} Oct 01 10:19:15 crc kubenswrapper[4735]: I1001 10:19:15.406720 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:19:15 crc kubenswrapper[4735]: I1001 10:19:15.430444 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" podStartSLOduration=94.430426525 podStartE2EDuration="1m34.430426525s" podCreationTimestamp="2025-10-01 10:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:15.429684685 +0000 UTC m=+114.122505957" watchObservedRunningTime="2025-10-01 10:19:15.430426525 +0000 UTC m=+114.123247777" Oct 01 10:19:15 crc kubenswrapper[4735]: I1001 10:19:15.430665 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lcqkv" podStartSLOduration=94.430660201 podStartE2EDuration="1m34.430660201s" podCreationTimestamp="2025-10-01 10:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:04.38984859 +0000 UTC m=+103.082669852" watchObservedRunningTime="2025-10-01 10:19:15.430660201 +0000 UTC m=+114.123481463" Oct 01 10:19:15 crc kubenswrapper[4735]: I1001 10:19:15.851042 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qm6mr"] Oct 01 10:19:15 crc kubenswrapper[4735]: I1001 10:19:15.851140 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:19:15 crc kubenswrapper[4735]: E1001 10:19:15.851226 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:19:16 crc kubenswrapper[4735]: I1001 10:19:16.896597 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:19:16 crc kubenswrapper[4735]: I1001 10:19:16.896617 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:19:16 crc kubenswrapper[4735]: I1001 10:19:16.896679 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:19:16 crc kubenswrapper[4735]: E1001 10:19:16.896698 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 10:19:16 crc kubenswrapper[4735]: E1001 10:19:16.896818 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 10:19:16 crc kubenswrapper[4735]: E1001 10:19:16.896908 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 10:19:17 crc kubenswrapper[4735]: I1001 10:19:17.896865 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:19:17 crc kubenswrapper[4735]: E1001 10:19:17.897741 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qm6mr" podUID="77b56a8b-1a27-4727-b45e-43fbc3847ddd" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.327457 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.327629 4735 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.357746 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.358394 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-625dj"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.358451 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.363017 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-625dj" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.364994 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-8dk8c"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.366300 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.372948 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wbs79"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.374315 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wbs79" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.381532 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.381715 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-dnlqj"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.381852 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.382017 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.382082 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.382160 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.382213 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.382245 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.382311 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.382394 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnlqj" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.382541 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.382552 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.382609 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.382779 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.383187 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.383315 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.383837 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.383994 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.384157 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.384296 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.384411 4735 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"config" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.384455 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.384668 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.384549 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.384930 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-sksft"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.385534 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-sksft" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.386055 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.386317 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.386603 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-279pq"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.387136 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.387183 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-279pq" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.387263 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwt8j"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.387845 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwt8j" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.389143 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-thbzv"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.389840 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-thbzv" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.390465 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.401910 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.401930 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.402684 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-sjt7w"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.403170 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-rs7ln"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.403548 4735 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.403663 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.403786 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.403873 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-sjt7w" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.403891 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.403866 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.403987 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.404090 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.407638 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.407766 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.407851 4735 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"service-ca-bundle" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.407924 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.407998 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.408011 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.408103 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.408139 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.408103 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.408230 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.408297 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.408363 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.408454 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 
10:19:18.408547 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.408606 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.408779 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.408887 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.409016 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.409137 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.409314 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.409442 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.410809 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.411165 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-dm5x5"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.411683 4735 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.411886 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-dm5x5" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.412691 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.413273 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.415397 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-kgr86"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.416191 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-kgr86" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.427670 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.430245 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7962v"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.430780 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.431264 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7962v" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.431307 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.431331 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.431694 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.431818 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.431846 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.431932 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.432015 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.432019 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.432071 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.432106 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 
10:19:18.432180 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.432327 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.432460 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.433321 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-76rns"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.435204 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.435428 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.435579 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.435617 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.447024 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-76rns" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.448048 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.453016 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dc4804a2-9d77-4820-bebd-93094a7143ec-etcd-client\") pod \"apiserver-7bbb656c7d-m7nkv\" (UID: \"dc4804a2-9d77-4820-bebd-93094a7143ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.453067 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc4804a2-9d77-4820-bebd-93094a7143ec-serving-cert\") pod \"apiserver-7bbb656c7d-m7nkv\" (UID: \"dc4804a2-9d77-4820-bebd-93094a7143ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.453089 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txgzq\" (UniqueName: \"kubernetes.io/projected/dc4804a2-9d77-4820-bebd-93094a7143ec-kube-api-access-txgzq\") pod \"apiserver-7bbb656c7d-m7nkv\" (UID: \"dc4804a2-9d77-4820-bebd-93094a7143ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.453115 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8b62c04d-523d-4858-bddf-5d2162392962-encryption-config\") pod \"apiserver-76f77b778f-8dk8c\" (UID: \"8b62c04d-523d-4858-bddf-5d2162392962\") " pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:18 crc 
kubenswrapper[4735]: I1001 10:19:18.453139 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8b62c04d-523d-4858-bddf-5d2162392962-node-pullsecrets\") pod \"apiserver-76f77b778f-8dk8c\" (UID: \"8b62c04d-523d-4858-bddf-5d2162392962\") " pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.453169 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dc4804a2-9d77-4820-bebd-93094a7143ec-audit-policies\") pod \"apiserver-7bbb656c7d-m7nkv\" (UID: \"dc4804a2-9d77-4820-bebd-93094a7143ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.453199 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8b62c04d-523d-4858-bddf-5d2162392962-audit-dir\") pod \"apiserver-76f77b778f-8dk8c\" (UID: \"8b62c04d-523d-4858-bddf-5d2162392962\") " pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.453222 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8b62c04d-523d-4858-bddf-5d2162392962-etcd-client\") pod \"apiserver-76f77b778f-8dk8c\" (UID: \"8b62c04d-523d-4858-bddf-5d2162392962\") " pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.453240 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc4804a2-9d77-4820-bebd-93094a7143ec-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-m7nkv\" (UID: \"dc4804a2-9d77-4820-bebd-93094a7143ec\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.453274 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d604b23e-ed8c-486f-b7d5-5a5ddd308947-config\") pod \"route-controller-manager-6576b87f9c-625dj\" (UID: \"d604b23e-ed8c-486f-b7d5-5a5ddd308947\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-625dj" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.453296 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8b62c04d-523d-4858-bddf-5d2162392962-etcd-serving-ca\") pod \"apiserver-76f77b778f-8dk8c\" (UID: \"8b62c04d-523d-4858-bddf-5d2162392962\") " pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.453317 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b62c04d-523d-4858-bddf-5d2162392962-trusted-ca-bundle\") pod \"apiserver-76f77b778f-8dk8c\" (UID: \"8b62c04d-523d-4858-bddf-5d2162392962\") " pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.453340 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b62c04d-523d-4858-bddf-5d2162392962-config\") pod \"apiserver-76f77b778f-8dk8c\" (UID: \"8b62c04d-523d-4858-bddf-5d2162392962\") " pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.453361 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c68nq\" (UniqueName: 
\"kubernetes.io/projected/8b62c04d-523d-4858-bddf-5d2162392962-kube-api-access-c68nq\") pod \"apiserver-76f77b778f-8dk8c\" (UID: \"8b62c04d-523d-4858-bddf-5d2162392962\") " pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.453382 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/dc4804a2-9d77-4820-bebd-93094a7143ec-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-m7nkv\" (UID: \"dc4804a2-9d77-4820-bebd-93094a7143ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.453401 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d604b23e-ed8c-486f-b7d5-5a5ddd308947-client-ca\") pod \"route-controller-manager-6576b87f9c-625dj\" (UID: \"d604b23e-ed8c-486f-b7d5-5a5ddd308947\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-625dj" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.453419 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8b62c04d-523d-4858-bddf-5d2162392962-image-import-ca\") pod \"apiserver-76f77b778f-8dk8c\" (UID: \"8b62c04d-523d-4858-bddf-5d2162392962\") " pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.453444 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/dc4804a2-9d77-4820-bebd-93094a7143ec-encryption-config\") pod \"apiserver-7bbb656c7d-m7nkv\" (UID: \"dc4804a2-9d77-4820-bebd-93094a7143ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.453481 
4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d604b23e-ed8c-486f-b7d5-5a5ddd308947-serving-cert\") pod \"route-controller-manager-6576b87f9c-625dj\" (UID: \"d604b23e-ed8c-486f-b7d5-5a5ddd308947\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-625dj" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.453532 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b62c04d-523d-4858-bddf-5d2162392962-serving-cert\") pod \"apiserver-76f77b778f-8dk8c\" (UID: \"8b62c04d-523d-4858-bddf-5d2162392962\") " pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.453566 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c76mg\" (UniqueName: \"kubernetes.io/projected/d604b23e-ed8c-486f-b7d5-5a5ddd308947-kube-api-access-c76mg\") pod \"route-controller-manager-6576b87f9c-625dj\" (UID: \"d604b23e-ed8c-486f-b7d5-5a5ddd308947\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-625dj" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.453575 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.453664 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dc4804a2-9d77-4820-bebd-93094a7143ec-audit-dir\") pod \"apiserver-7bbb656c7d-m7nkv\" (UID: \"dc4804a2-9d77-4820-bebd-93094a7143ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.453709 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8b62c04d-523d-4858-bddf-5d2162392962-audit\") pod \"apiserver-76f77b778f-8dk8c\" (UID: \"8b62c04d-523d-4858-bddf-5d2162392962\") " pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.453904 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.454035 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.458451 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.459460 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-klpcq"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.460407 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9r92q"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.460936 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-9r92q" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.460986 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gbmw6"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.461711 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gbmw6" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.462159 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-qc826"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.462701 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-qc826" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.464088 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-8dkrf"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.464403 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.464788 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8dkrf" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.465403 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.465623 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.465755 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.465942 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.467100 4735 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"kube-root-ca.crt" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.467103 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.467206 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.467286 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.468549 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-xtfsg"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.469258 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-xtfsg" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.469989 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.470671 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvc25"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.471021 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.471084 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvc25" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.471170 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.471343 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.471478 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.471396 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.471518 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.471655 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.471778 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.472214 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-drrzv"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.472247 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.472848 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-drrzv" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.473559 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kkmzc"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.474175 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kkmzc" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.474765 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-d4p82"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.474801 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.475591 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-d4p82" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.476264 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kqslj"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.476755 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kqslj" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.496456 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.497443 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rcv57"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.498686 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fxcpc"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.499235 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fxcpc" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.499676 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rcv57" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.502350 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-625dj"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.502974 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.503313 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.504564 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fkcd4"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.506186 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fkcd4" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.506882 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwvhh"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.508134 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwvhh" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.509183 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2jcl8"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.513188 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-2jcl8" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.521041 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.521316 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-cx2v2"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.522136 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cx2v2" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.522844 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cqpz2"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.523878 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cqpz2" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.524583 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-xzlpn"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.525097 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-xzlpn" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.526974 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321895-x8dfs"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.527535 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321895-x8dfs" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.527864 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7rvp"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.528859 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7rvp" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.529570 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-sngf8"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.530133 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sngf8" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.532773 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-phjsf"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.533829 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-phjsf" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.535434 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-thbzv"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.537316 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwt8j"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.538812 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.539750 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-sjt7w"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.539759 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.541368 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-tg55d"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.542531 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-l67sz"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.543595 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gbmw6"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.543692 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-l67sz" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.543838 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-tg55d" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.545560 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-d4p82"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.545786 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fxcpc"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.547390 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-dm5x5"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.548392 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-kgr86"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.549733 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-sksft"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.550684 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-dnlqj"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.553598 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-8dkrf"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.553633 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-76rns"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.553663 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wbs79"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.556949 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-klpcq"] Oct 01 10:19:18 crc 
kubenswrapper[4735]: I1001 10:19:18.556975 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7962v"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.556987 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-8dk8c"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.559890 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fkcd4"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.560513 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b62c04d-523d-4858-bddf-5d2162392962-config\") pod \"apiserver-76f77b778f-8dk8c\" (UID: \"8b62c04d-523d-4858-bddf-5d2162392962\") " pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.560549 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c68nq\" (UniqueName: \"kubernetes.io/projected/8b62c04d-523d-4858-bddf-5d2162392962-kube-api-access-c68nq\") pod \"apiserver-76f77b778f-8dk8c\" (UID: \"8b62c04d-523d-4858-bddf-5d2162392962\") " pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.560575 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3f0508a-69f7-42a8-b39e-58c8d6c12c51-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gbmw6\" (UID: \"c3f0508a-69f7-42a8-b39e-58c8d6c12c51\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gbmw6" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.560596 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/64029635-e98e-458d-ace4-275d8f40d34e-trusted-ca\") pod \"console-operator-58897d9998-sjt7w\" (UID: \"64029635-e98e-458d-ace4-275d8f40d34e\") " pod="openshift-console-operator/console-operator-58897d9998-sjt7w" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.560617 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/dc4804a2-9d77-4820-bebd-93094a7143ec-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-m7nkv\" (UID: \"dc4804a2-9d77-4820-bebd-93094a7143ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.560639 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5afd3b9d-a9e2-4a51-9e1f-fcebffc8215b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kkmzc\" (UID: \"5afd3b9d-a9e2-4a51-9e1f-fcebffc8215b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kkmzc" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.560658 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c3f0508a-69f7-42a8-b39e-58c8d6c12c51-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gbmw6\" (UID: \"c3f0508a-69f7-42a8-b39e-58c8d6c12c51\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gbmw6" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.560680 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d604b23e-ed8c-486f-b7d5-5a5ddd308947-client-ca\") pod \"route-controller-manager-6576b87f9c-625dj\" (UID: \"d604b23e-ed8c-486f-b7d5-5a5ddd308947\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-625dj" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.560699 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8b62c04d-523d-4858-bddf-5d2162392962-image-import-ca\") pod \"apiserver-76f77b778f-8dk8c\" (UID: \"8b62c04d-523d-4858-bddf-5d2162392962\") " pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.560717 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64029635-e98e-458d-ace4-275d8f40d34e-serving-cert\") pod \"console-operator-58897d9998-sjt7w\" (UID: \"64029635-e98e-458d-ace4-275d8f40d34e\") " pod="openshift-console-operator/console-operator-58897d9998-sjt7w" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.560738 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wlbf\" (UniqueName: \"kubernetes.io/projected/5e4cc884-e7b2-4c28-9b45-475517f210a8-kube-api-access-6wlbf\") pod \"openshift-controller-manager-operator-756b6f6bc6-kqslj\" (UID: \"5e4cc884-e7b2-4c28-9b45-475517f210a8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kqslj" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.560761 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/dc4804a2-9d77-4820-bebd-93094a7143ec-encryption-config\") pod \"apiserver-7bbb656c7d-m7nkv\" (UID: \"dc4804a2-9d77-4820-bebd-93094a7143ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.560784 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d604b23e-ed8c-486f-b7d5-5a5ddd308947-serving-cert\") pod \"route-controller-manager-6576b87f9c-625dj\" (UID: \"d604b23e-ed8c-486f-b7d5-5a5ddd308947\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-625dj" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.560802 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b62c04d-523d-4858-bddf-5d2162392962-serving-cert\") pod \"apiserver-76f77b778f-8dk8c\" (UID: \"8b62c04d-523d-4858-bddf-5d2162392962\") " pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.560821 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6963210d-abf6-43ad-80ea-72831b6d7504-images\") pod \"machine-api-operator-5694c8668f-kgr86\" (UID: \"6963210d-abf6-43ad-80ea-72831b6d7504\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kgr86" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.560848 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gq5j\" (UniqueName: \"kubernetes.io/projected/64029635-e98e-458d-ace4-275d8f40d34e-kube-api-access-2gq5j\") pod \"console-operator-58897d9998-sjt7w\" (UID: \"64029635-e98e-458d-ace4-275d8f40d34e\") " pod="openshift-console-operator/console-operator-58897d9998-sjt7w" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.560869 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c76mg\" (UniqueName: \"kubernetes.io/projected/d604b23e-ed8c-486f-b7d5-5a5ddd308947-kube-api-access-c76mg\") pod \"route-controller-manager-6576b87f9c-625dj\" (UID: \"d604b23e-ed8c-486f-b7d5-5a5ddd308947\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-625dj" Oct 01 10:19:18 crc 
kubenswrapper[4735]: I1001 10:19:18.560889 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dc4804a2-9d77-4820-bebd-93094a7143ec-audit-dir\") pod \"apiserver-7bbb656c7d-m7nkv\" (UID: \"dc4804a2-9d77-4820-bebd-93094a7143ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.560912 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5afd3b9d-a9e2-4a51-9e1f-fcebffc8215b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kkmzc\" (UID: \"5afd3b9d-a9e2-4a51-9e1f-fcebffc8215b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kkmzc" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.560933 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz889\" (UniqueName: \"kubernetes.io/projected/6963210d-abf6-43ad-80ea-72831b6d7504-kube-api-access-rz889\") pod \"machine-api-operator-5694c8668f-kgr86\" (UID: \"6963210d-abf6-43ad-80ea-72831b6d7504\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kgr86" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.560962 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8b62c04d-523d-4858-bddf-5d2162392962-audit\") pod \"apiserver-76f77b778f-8dk8c\" (UID: \"8b62c04d-523d-4858-bddf-5d2162392962\") " pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.560994 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8b62c04d-523d-4858-bddf-5d2162392962-encryption-config\") pod \"apiserver-76f77b778f-8dk8c\" (UID: 
\"8b62c04d-523d-4858-bddf-5d2162392962\") " pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.561015 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dc4804a2-9d77-4820-bebd-93094a7143ec-etcd-client\") pod \"apiserver-7bbb656c7d-m7nkv\" (UID: \"dc4804a2-9d77-4820-bebd-93094a7143ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.561033 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc4804a2-9d77-4820-bebd-93094a7143ec-serving-cert\") pod \"apiserver-7bbb656c7d-m7nkv\" (UID: \"dc4804a2-9d77-4820-bebd-93094a7143ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.561052 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txgzq\" (UniqueName: \"kubernetes.io/projected/dc4804a2-9d77-4820-bebd-93094a7143ec-kube-api-access-txgzq\") pod \"apiserver-7bbb656c7d-m7nkv\" (UID: \"dc4804a2-9d77-4820-bebd-93094a7143ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.561075 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8b62c04d-523d-4858-bddf-5d2162392962-node-pullsecrets\") pod \"apiserver-76f77b778f-8dk8c\" (UID: \"8b62c04d-523d-4858-bddf-5d2162392962\") " pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.561096 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3f0508a-69f7-42a8-b39e-58c8d6c12c51-config\") pod 
\"kube-controller-manager-operator-78b949d7b-gbmw6\" (UID: \"c3f0508a-69f7-42a8-b39e-58c8d6c12c51\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gbmw6" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.561131 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dc4804a2-9d77-4820-bebd-93094a7143ec-audit-policies\") pod \"apiserver-7bbb656c7d-m7nkv\" (UID: \"dc4804a2-9d77-4820-bebd-93094a7143ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.561152 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lzwz\" (UniqueName: \"kubernetes.io/projected/c8642ffb-52ec-44db-b2f8-33a1e98b5328-kube-api-access-9lzwz\") pod \"downloads-7954f5f757-dm5x5\" (UID: \"c8642ffb-52ec-44db-b2f8-33a1e98b5328\") " pod="openshift-console/downloads-7954f5f757-dm5x5" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.561173 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8b62c04d-523d-4858-bddf-5d2162392962-audit-dir\") pod \"apiserver-76f77b778f-8dk8c\" (UID: \"8b62c04d-523d-4858-bddf-5d2162392962\") " pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.561192 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64029635-e98e-458d-ace4-275d8f40d34e-config\") pod \"console-operator-58897d9998-sjt7w\" (UID: \"64029635-e98e-458d-ace4-275d8f40d34e\") " pod="openshift-console-operator/console-operator-58897d9998-sjt7w" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.561213 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6963210d-abf6-43ad-80ea-72831b6d7504-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-kgr86\" (UID: \"6963210d-abf6-43ad-80ea-72831b6d7504\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kgr86" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.561234 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e4cc884-e7b2-4c28-9b45-475517f210a8-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-kqslj\" (UID: \"5e4cc884-e7b2-4c28-9b45-475517f210a8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kqslj" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.561262 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8b62c04d-523d-4858-bddf-5d2162392962-etcd-client\") pod \"apiserver-76f77b778f-8dk8c\" (UID: \"8b62c04d-523d-4858-bddf-5d2162392962\") " pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.561300 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc4804a2-9d77-4820-bebd-93094a7143ec-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-m7nkv\" (UID: \"dc4804a2-9d77-4820-bebd-93094a7143ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.561329 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e4cc884-e7b2-4c28-9b45-475517f210a8-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-kqslj\" (UID: \"5e4cc884-e7b2-4c28-9b45-475517f210a8\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kqslj" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.561352 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8b62c04d-523d-4858-bddf-5d2162392962-etcd-serving-ca\") pod \"apiserver-76f77b778f-8dk8c\" (UID: \"8b62c04d-523d-4858-bddf-5d2162392962\") " pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.561371 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5afd3b9d-a9e2-4a51-9e1f-fcebffc8215b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kkmzc\" (UID: \"5afd3b9d-a9e2-4a51-9e1f-fcebffc8215b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kkmzc" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.561391 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d604b23e-ed8c-486f-b7d5-5a5ddd308947-config\") pod \"route-controller-manager-6576b87f9c-625dj\" (UID: \"d604b23e-ed8c-486f-b7d5-5a5ddd308947\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-625dj" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.561410 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b62c04d-523d-4858-bddf-5d2162392962-trusted-ca-bundle\") pod \"apiserver-76f77b778f-8dk8c\" (UID: \"8b62c04d-523d-4858-bddf-5d2162392962\") " pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.561430 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6963210d-abf6-43ad-80ea-72831b6d7504-config\") pod \"machine-api-operator-5694c8668f-kgr86\" (UID: \"6963210d-abf6-43ad-80ea-72831b6d7504\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kgr86" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.562193 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b62c04d-523d-4858-bddf-5d2162392962-config\") pod \"apiserver-76f77b778f-8dk8c\" (UID: \"8b62c04d-523d-4858-bddf-5d2162392962\") " pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.562895 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/dc4804a2-9d77-4820-bebd-93094a7143ec-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-m7nkv\" (UID: \"dc4804a2-9d77-4820-bebd-93094a7143ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.562964 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-drrzv"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.562983 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kqslj"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.562992 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9r92q"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.563705 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d604b23e-ed8c-486f-b7d5-5a5ddd308947-client-ca\") pod \"route-controller-manager-6576b87f9c-625dj\" (UID: \"d604b23e-ed8c-486f-b7d5-5a5ddd308947\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-625dj" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.563824 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8b62c04d-523d-4858-bddf-5d2162392962-node-pullsecrets\") pod \"apiserver-76f77b778f-8dk8c\" (UID: \"8b62c04d-523d-4858-bddf-5d2162392962\") " pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.564340 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dc4804a2-9d77-4820-bebd-93094a7143ec-audit-policies\") pod \"apiserver-7bbb656c7d-m7nkv\" (UID: \"dc4804a2-9d77-4820-bebd-93094a7143ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.564398 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8b62c04d-523d-4858-bddf-5d2162392962-audit-dir\") pod \"apiserver-76f77b778f-8dk8c\" (UID: \"8b62c04d-523d-4858-bddf-5d2162392962\") " pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.564417 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8b62c04d-523d-4858-bddf-5d2162392962-image-import-ca\") pod \"apiserver-76f77b778f-8dk8c\" (UID: \"8b62c04d-523d-4858-bddf-5d2162392962\") " pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.566403 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rcv57"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.566446 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kkmzc"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.566458 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2jcl8"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.567891 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.568170 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dc4804a2-9d77-4820-bebd-93094a7143ec-audit-dir\") pod \"apiserver-7bbb656c7d-m7nkv\" (UID: \"dc4804a2-9d77-4820-bebd-93094a7143ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.571570 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvc25"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.571645 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-xzlpn"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.572928 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8b62c04d-523d-4858-bddf-5d2162392962-etcd-serving-ca\") pod \"apiserver-76f77b778f-8dk8c\" (UID: \"8b62c04d-523d-4858-bddf-5d2162392962\") " pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.573719 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-phjsf"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.573966 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/dc4804a2-9d77-4820-bebd-93094a7143ec-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-m7nkv\" (UID: \"dc4804a2-9d77-4820-bebd-93094a7143ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.574303 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8b62c04d-523d-4858-bddf-5d2162392962-audit\") pod \"apiserver-76f77b778f-8dk8c\" (UID: \"8b62c04d-523d-4858-bddf-5d2162392962\") " pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.574563 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d604b23e-ed8c-486f-b7d5-5a5ddd308947-config\") pod \"route-controller-manager-6576b87f9c-625dj\" (UID: \"d604b23e-ed8c-486f-b7d5-5a5ddd308947\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-625dj" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.575264 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-rs7ln"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.575903 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b62c04d-523d-4858-bddf-5d2162392962-serving-cert\") pod \"apiserver-76f77b778f-8dk8c\" (UID: \"8b62c04d-523d-4858-bddf-5d2162392962\") " pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.576972 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8b62c04d-523d-4858-bddf-5d2162392962-etcd-client\") pod \"apiserver-76f77b778f-8dk8c\" (UID: \"8b62c04d-523d-4858-bddf-5d2162392962\") " pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 
10:19:18.577384 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-xtfsg"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.577900 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d604b23e-ed8c-486f-b7d5-5a5ddd308947-serving-cert\") pod \"route-controller-manager-6576b87f9c-625dj\" (UID: \"d604b23e-ed8c-486f-b7d5-5a5ddd308947\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-625dj" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.579104 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-sngf8"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.580193 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/dc4804a2-9d77-4820-bebd-93094a7143ec-encryption-config\") pod \"apiserver-7bbb656c7d-m7nkv\" (UID: \"dc4804a2-9d77-4820-bebd-93094a7143ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.580366 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc4804a2-9d77-4820-bebd-93094a7143ec-serving-cert\") pod \"apiserver-7bbb656c7d-m7nkv\" (UID: \"dc4804a2-9d77-4820-bebd-93094a7143ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.580423 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-l67sz"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.580563 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b62c04d-523d-4858-bddf-5d2162392962-trusted-ca-bundle\") pod \"apiserver-76f77b778f-8dk8c\" (UID: 
\"8b62c04d-523d-4858-bddf-5d2162392962\") " pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.580891 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8b62c04d-523d-4858-bddf-5d2162392962-encryption-config\") pod \"apiserver-76f77b778f-8dk8c\" (UID: \"8b62c04d-523d-4858-bddf-5d2162392962\") " pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.581851 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.581982 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dc4804a2-9d77-4820-bebd-93094a7143ec-etcd-client\") pod \"apiserver-7bbb656c7d-m7nkv\" (UID: \"dc4804a2-9d77-4820-bebd-93094a7143ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.582029 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7rvp"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.582949 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwvhh"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.584406 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cqpz2"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.593774 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-fkz86"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.594928 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-qmlh7"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 
10:19:18.597573 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321895-x8dfs"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.597605 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-cx2v2"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.597748 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qmlh7" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.598126 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fkz86" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.600260 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.601098 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qmlh7"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.602807 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fkz86"] Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.620864 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.640209 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.660318 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.661849 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3f0508a-69f7-42a8-b39e-58c8d6c12c51-config\") pod 
\"kube-controller-manager-operator-78b949d7b-gbmw6\" (UID: \"c3f0508a-69f7-42a8-b39e-58c8d6c12c51\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gbmw6" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.661894 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lzwz\" (UniqueName: \"kubernetes.io/projected/c8642ffb-52ec-44db-b2f8-33a1e98b5328-kube-api-access-9lzwz\") pod \"downloads-7954f5f757-dm5x5\" (UID: \"c8642ffb-52ec-44db-b2f8-33a1e98b5328\") " pod="openshift-console/downloads-7954f5f757-dm5x5" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.661917 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64029635-e98e-458d-ace4-275d8f40d34e-config\") pod \"console-operator-58897d9998-sjt7w\" (UID: \"64029635-e98e-458d-ace4-275d8f40d34e\") " pod="openshift-console-operator/console-operator-58897d9998-sjt7w" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.661933 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6963210d-abf6-43ad-80ea-72831b6d7504-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-kgr86\" (UID: \"6963210d-abf6-43ad-80ea-72831b6d7504\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kgr86" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.661949 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e4cc884-e7b2-4c28-9b45-475517f210a8-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-kqslj\" (UID: \"5e4cc884-e7b2-4c28-9b45-475517f210a8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kqslj" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.661983 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e4cc884-e7b2-4c28-9b45-475517f210a8-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-kqslj\" (UID: \"5e4cc884-e7b2-4c28-9b45-475517f210a8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kqslj" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.662001 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5afd3b9d-a9e2-4a51-9e1f-fcebffc8215b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kkmzc\" (UID: \"5afd3b9d-a9e2-4a51-9e1f-fcebffc8215b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kkmzc" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.662017 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6963210d-abf6-43ad-80ea-72831b6d7504-config\") pod \"machine-api-operator-5694c8668f-kgr86\" (UID: \"6963210d-abf6-43ad-80ea-72831b6d7504\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kgr86" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.662038 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3f0508a-69f7-42a8-b39e-58c8d6c12c51-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gbmw6\" (UID: \"c3f0508a-69f7-42a8-b39e-58c8d6c12c51\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gbmw6" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.662052 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/64029635-e98e-458d-ace4-275d8f40d34e-trusted-ca\") pod \"console-operator-58897d9998-sjt7w\" (UID: \"64029635-e98e-458d-ace4-275d8f40d34e\") " 
pod="openshift-console-operator/console-operator-58897d9998-sjt7w" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.662067 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5afd3b9d-a9e2-4a51-9e1f-fcebffc8215b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kkmzc\" (UID: \"5afd3b9d-a9e2-4a51-9e1f-fcebffc8215b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kkmzc" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.662083 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c3f0508a-69f7-42a8-b39e-58c8d6c12c51-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gbmw6\" (UID: \"c3f0508a-69f7-42a8-b39e-58c8d6c12c51\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gbmw6" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.662097 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64029635-e98e-458d-ace4-275d8f40d34e-serving-cert\") pod \"console-operator-58897d9998-sjt7w\" (UID: \"64029635-e98e-458d-ace4-275d8f40d34e\") " pod="openshift-console-operator/console-operator-58897d9998-sjt7w" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.662111 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wlbf\" (UniqueName: \"kubernetes.io/projected/5e4cc884-e7b2-4c28-9b45-475517f210a8-kube-api-access-6wlbf\") pod \"openshift-controller-manager-operator-756b6f6bc6-kqslj\" (UID: \"5e4cc884-e7b2-4c28-9b45-475517f210a8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kqslj" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.662128 4735 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6963210d-abf6-43ad-80ea-72831b6d7504-images\") pod \"machine-api-operator-5694c8668f-kgr86\" (UID: \"6963210d-abf6-43ad-80ea-72831b6d7504\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kgr86" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.662144 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gq5j\" (UniqueName: \"kubernetes.io/projected/64029635-e98e-458d-ace4-275d8f40d34e-kube-api-access-2gq5j\") pod \"console-operator-58897d9998-sjt7w\" (UID: \"64029635-e98e-458d-ace4-275d8f40d34e\") " pod="openshift-console-operator/console-operator-58897d9998-sjt7w" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.662171 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz889\" (UniqueName: \"kubernetes.io/projected/6963210d-abf6-43ad-80ea-72831b6d7504-kube-api-access-rz889\") pod \"machine-api-operator-5694c8668f-kgr86\" (UID: \"6963210d-abf6-43ad-80ea-72831b6d7504\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kgr86" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.662186 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5afd3b9d-a9e2-4a51-9e1f-fcebffc8215b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kkmzc\" (UID: \"5afd3b9d-a9e2-4a51-9e1f-fcebffc8215b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kkmzc" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.662775 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64029635-e98e-458d-ace4-275d8f40d34e-config\") pod \"console-operator-58897d9998-sjt7w\" (UID: \"64029635-e98e-458d-ace4-275d8f40d34e\") " pod="openshift-console-operator/console-operator-58897d9998-sjt7w" Oct 01 
10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.662965 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6963210d-abf6-43ad-80ea-72831b6d7504-config\") pod \"machine-api-operator-5694c8668f-kgr86\" (UID: \"6963210d-abf6-43ad-80ea-72831b6d7504\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kgr86" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.663047 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3f0508a-69f7-42a8-b39e-58c8d6c12c51-config\") pod \"kube-controller-manager-operator-78b949d7b-gbmw6\" (UID: \"c3f0508a-69f7-42a8-b39e-58c8d6c12c51\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gbmw6" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.664215 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6963210d-abf6-43ad-80ea-72831b6d7504-images\") pod \"machine-api-operator-5694c8668f-kgr86\" (UID: \"6963210d-abf6-43ad-80ea-72831b6d7504\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kgr86" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.664238 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/64029635-e98e-458d-ace4-275d8f40d34e-trusted-ca\") pod \"console-operator-58897d9998-sjt7w\" (UID: \"64029635-e98e-458d-ace4-275d8f40d34e\") " pod="openshift-console-operator/console-operator-58897d9998-sjt7w" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.665633 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3f0508a-69f7-42a8-b39e-58c8d6c12c51-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gbmw6\" (UID: \"c3f0508a-69f7-42a8-b39e-58c8d6c12c51\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gbmw6" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.666700 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6963210d-abf6-43ad-80ea-72831b6d7504-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-kgr86\" (UID: \"6963210d-abf6-43ad-80ea-72831b6d7504\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kgr86" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.667035 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64029635-e98e-458d-ace4-275d8f40d34e-serving-cert\") pod \"console-operator-58897d9998-sjt7w\" (UID: \"64029635-e98e-458d-ace4-275d8f40d34e\") " pod="openshift-console-operator/console-operator-58897d9998-sjt7w" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.681742 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.700333 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.720419 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.745646 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.760789 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.781356 4735 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"installation-pull-secrets" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.801059 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.820564 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.861098 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.880825 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.896340 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.896614 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.896783 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.900626 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.920371 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.940585 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.960086 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 01 10:19:18 crc kubenswrapper[4735]: I1001 10:19:18.986602 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.000747 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.020183 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.041124 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.060658 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.080765 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.101062 4735 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.120488 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.140247 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.160363 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.180862 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.200959 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.206518 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5afd3b9d-a9e2-4a51-9e1f-fcebffc8215b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kkmzc\" (UID: \"5afd3b9d-a9e2-4a51-9e1f-fcebffc8215b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kkmzc" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.221708 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.241217 4735 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.261052 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.263801 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5afd3b9d-a9e2-4a51-9e1f-fcebffc8215b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kkmzc\" (UID: \"5afd3b9d-a9e2-4a51-9e1f-fcebffc8215b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kkmzc" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.280924 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.300690 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.320708 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.340900 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.361269 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.380108 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.400686 4735 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.407346 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e4cc884-e7b2-4c28-9b45-475517f210a8-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-kqslj\" (UID: \"5e4cc884-e7b2-4c28-9b45-475517f210a8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kqslj" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.420734 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.441201 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.443086 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e4cc884-e7b2-4c28-9b45-475517f210a8-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-kqslj\" (UID: \"5e4cc884-e7b2-4c28-9b45-475517f210a8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kqslj" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.460621 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.480649 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.501276 4735 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.519275 4735 request.go:700] Waited for 1.018931694s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.540734 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.560829 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.581033 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.601152 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.620117 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.640256 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.661044 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.680200 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 01 10:19:19 crc kubenswrapper[4735]: 
I1001 10:19:19.700858 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.720776 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.740698 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.760312 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.780683 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.806647 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.821751 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.840671 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.861673 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.881172 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.896138 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.901426 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.921040 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.940911 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.960611 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 01 10:19:19 crc kubenswrapper[4735]: I1001 10:19:19.981071 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.000554 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.021132 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.040807 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.060682 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.080697 4735 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.100399 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.110164 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.120737 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.140984 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.161098 4735 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.180147 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.201130 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.221314 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.240858 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.261041 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 
01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.298309 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c68nq\" (UniqueName: \"kubernetes.io/projected/8b62c04d-523d-4858-bddf-5d2162392962-kube-api-access-c68nq\") pod \"apiserver-76f77b778f-8dk8c\" (UID: \"8b62c04d-523d-4858-bddf-5d2162392962\") " pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.318737 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txgzq\" (UniqueName: \"kubernetes.io/projected/dc4804a2-9d77-4820-bebd-93094a7143ec-kube-api-access-txgzq\") pod \"apiserver-7bbb656c7d-m7nkv\" (UID: \"dc4804a2-9d77-4820-bebd-93094a7143ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.335437 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c76mg\" (UniqueName: \"kubernetes.io/projected/d604b23e-ed8c-486f-b7d5-5a5ddd308947-kube-api-access-c76mg\") pod \"route-controller-manager-6576b87f9c-625dj\" (UID: \"d604b23e-ed8c-486f-b7d5-5a5ddd308947\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-625dj" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.341363 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.361432 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.381432 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.401578 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.421362 
4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.441410 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.461163 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.497521 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lzwz\" (UniqueName: \"kubernetes.io/projected/c8642ffb-52ec-44db-b2f8-33a1e98b5328-kube-api-access-9lzwz\") pod \"downloads-7954f5f757-dm5x5\" (UID: \"c8642ffb-52ec-44db-b2f8-33a1e98b5328\") " pod="openshift-console/downloads-7954f5f757-dm5x5" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.513768 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.524864 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5afd3b9d-a9e2-4a51-9e1f-fcebffc8215b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kkmzc\" (UID: \"5afd3b9d-a9e2-4a51-9e1f-fcebffc8215b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kkmzc" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.538727 4735 request.go:700] Waited for 1.875031055s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/serviceaccounts/console-operator/token Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.547287 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/c3f0508a-69f7-42a8-b39e-58c8d6c12c51-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gbmw6\" (UID: \"c3f0508a-69f7-42a8-b39e-58c8d6c12c51\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gbmw6" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.556226 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gq5j\" (UniqueName: \"kubernetes.io/projected/64029635-e98e-458d-ace4-275d8f40d34e-kube-api-access-2gq5j\") pod \"console-operator-58897d9998-sjt7w\" (UID: \"64029635-e98e-458d-ace4-275d8f40d34e\") " pod="openshift-console-operator/console-operator-58897d9998-sjt7w" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.573631 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.586520 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wlbf\" (UniqueName: \"kubernetes.io/projected/5e4cc884-e7b2-4c28-9b45-475517f210a8-kube-api-access-6wlbf\") pod \"openshift-controller-manager-operator-756b6f6bc6-kqslj\" (UID: \"5e4cc884-e7b2-4c28-9b45-475517f210a8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kqslj" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.605039 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-625dj" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.605073 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz889\" (UniqueName: \"kubernetes.io/projected/6963210d-abf6-43ad-80ea-72831b6d7504-kube-api-access-rz889\") pod \"machine-api-operator-5694c8668f-kgr86\" (UID: \"6963210d-abf6-43ad-80ea-72831b6d7504\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kgr86" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.622069 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.641443 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.661276 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.667643 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-sjt7w" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.680751 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.709781 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-dm5x5" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.717094 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-kgr86" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.720858 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.729851 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv"] Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.741104 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.745173 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gbmw6" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.760969 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-8dk8c"] Oct 01 10:19:20 crc kubenswrapper[4735]: W1001 10:19:20.780242 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b62c04d_523d_4858_bddf_5d2162392962.slice/crio-27d455b21706c55942d5c1b3b46c997e79e5d77e6bd9a1922a598f8af3b73f77 WatchSource:0}: Error finding container 27d455b21706c55942d5c1b3b46c997e79e5d77e6bd9a1922a598f8af3b73f77: Status 404 returned error can't find the container with id 27d455b21706c55942d5c1b3b46c997e79e5d77e6bd9a1922a598f8af3b73f77 Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.787303 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dkj7\" (UniqueName: \"kubernetes.io/projected/b5fb857f-fab1-49d9-9c0b-a9a8bcf1dd11-kube-api-access-7dkj7\") pod \"cluster-samples-operator-665b6dd947-thbzv\" (UID: \"b5fb857f-fab1-49d9-9c0b-a9a8bcf1dd11\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-thbzv" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.787343 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2236757b-1fd9-4aea-9b6d-e5f56ae7ff42-auth-proxy-config\") pod \"machine-approver-56656f9798-279pq\" (UID: \"2236757b-1fd9-4aea-9b6d-e5f56ae7ff42\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-279pq" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.787367 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjvdp\" (UniqueName: \"kubernetes.io/projected/1ba38415-c2e4-4550-a610-40bcee6d1323-kube-api-access-fjvdp\") pod \"kube-storage-version-migrator-operator-b67b599dd-drrzv\" (UID: \"1ba38415-c2e4-4550-a610-40bcee6d1323\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-drrzv" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.787412 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf1e6345-2675-413e-bd53-d456e57b08bd-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-wbs79\" (UID: \"bf1e6345-2675-413e-bd53-d456e57b08bd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wbs79" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.787865 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8e03e157-1b07-41e4-80a2-c751ca26e2a7-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7962v\" (UID: \"8e03e157-1b07-41e4-80a2-c751ca26e2a7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7962v" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 
10:19:20.787900 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.787921 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.787955 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5e814370-2cf9-4f05-ba2a-cf771d55529f-metrics-tls\") pod \"dns-operator-744455d44c-d4p82\" (UID: \"5e814370-2cf9-4f05-ba2a-cf771d55529f\") " pod="openshift-dns-operator/dns-operator-744455d44c-d4p82" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.787976 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4f6a88e2-4257-4ed0-9381-6dcd9d5647bb-metrics-tls\") pod \"ingress-operator-5b745b69d9-8dkrf\" (UID: \"4f6a88e2-4257-4ed0-9381-6dcd9d5647bb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8dkrf" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.788006 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ba38415-c2e4-4550-a610-40bcee6d1323-serving-cert\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-drrzv\" (UID: \"1ba38415-c2e4-4550-a610-40bcee6d1323\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-drrzv" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.788045 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvcfd\" (UniqueName: \"kubernetes.io/projected/33d4526f-3a59-40c2-b9c9-d93ced5dcd17-kube-api-access-gvcfd\") pod \"machine-config-controller-84d6567774-rcv57\" (UID: \"33d4526f-3a59-40c2-b9c9-d93ced5dcd17\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rcv57" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.788077 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.788121 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2236757b-1fd9-4aea-9b6d-e5f56ae7ff42-config\") pod \"machine-approver-56656f9798-279pq\" (UID: \"2236757b-1fd9-4aea-9b6d-e5f56ae7ff42\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-279pq" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.788140 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97d879b1-ea51-444b-8c31-5dc89c3a1fb2-serving-cert\") pod \"authentication-operator-69f744f599-sksft\" (UID: \"97d879b1-ea51-444b-8c31-5dc89c3a1fb2\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-sksft" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.788164 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp99f\" (UniqueName: \"kubernetes.io/projected/a862cad8-e4de-41c5-a83b-734574bc958e-kube-api-access-pp99f\") pod \"openshift-apiserver-operator-796bbdcf4f-hwt8j\" (UID: \"a862cad8-e4de-41c5-a83b-734574bc958e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwt8j" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.788184 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4f6a88e2-4257-4ed0-9381-6dcd9d5647bb-bound-sa-token\") pod \"ingress-operator-5b745b69d9-8dkrf\" (UID: \"4f6a88e2-4257-4ed0-9381-6dcd9d5647bb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8dkrf" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.788205 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjcn5\" (UniqueName: \"kubernetes.io/projected/4f6a88e2-4257-4ed0-9381-6dcd9d5647bb-kube-api-access-vjcn5\") pod \"ingress-operator-5b745b69d9-8dkrf\" (UID: \"4f6a88e2-4257-4ed0-9381-6dcd9d5647bb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8dkrf" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.788228 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.788252 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2wnd\" (UniqueName: \"kubernetes.io/projected/bf1e6345-2675-413e-bd53-d456e57b08bd-kube-api-access-p2wnd\") pod \"controller-manager-879f6c89f-wbs79\" (UID: \"bf1e6345-2675-413e-bd53-d456e57b08bd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wbs79" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.788272 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2245b41c-0ccc-47ae-92c8-84ecfd80c53e-webhook-cert\") pod \"packageserver-d55dfcdfc-fxcpc\" (UID: \"2245b41c-0ccc-47ae-92c8-84ecfd80c53e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fxcpc" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.788293 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5e28238a-9bf2-4e10-827d-7350e0ec0150-oauth-serving-cert\") pod \"console-f9d7485db-xtfsg\" (UID: \"5e28238a-9bf2-4e10-827d-7350e0ec0150\") " pod="openshift-console/console-f9d7485db-xtfsg" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.788315 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97d879b1-ea51-444b-8c31-5dc89c3a1fb2-service-ca-bundle\") pod \"authentication-operator-69f744f599-sksft\" (UID: \"97d879b1-ea51-444b-8c31-5dc89c3a1fb2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sksft" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.788336 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19fd9940-52eb-4a65-8e75-531a27563c1b-registry-tls\") pod 
\"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.788356 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/70b90ab7-92ad-43c7-9796-15ec49caae3e-srv-cert\") pod \"olm-operator-6b444d44fb-hvc25\" (UID: \"70b90ab7-92ad-43c7-9796-15ec49caae3e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvc25" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.788390 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf1e6345-2675-413e-bd53-d456e57b08bd-config\") pod \"controller-manager-879f6c89f-wbs79\" (UID: \"bf1e6345-2675-413e-bd53-d456e57b08bd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wbs79" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.788486 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6718bff7-4d68-4aa2-ad2b-1511e0799683-audit-dir\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.788527 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/19fd9940-52eb-4a65-8e75-531a27563c1b-trusted-ca\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.788551 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/19fd9940-52eb-4a65-8e75-531a27563c1b-bound-sa-token\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.789275 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6d59\" (UniqueName: \"kubernetes.io/projected/70b90ab7-92ad-43c7-9796-15ec49caae3e-kube-api-access-k6d59\") pod \"olm-operator-6b444d44fb-hvc25\" (UID: \"70b90ab7-92ad-43c7-9796-15ec49caae3e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvc25" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.789300 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdx7d\" (UniqueName: \"kubernetes.io/projected/2245b41c-0ccc-47ae-92c8-84ecfd80c53e-kube-api-access-jdx7d\") pod \"packageserver-d55dfcdfc-fxcpc\" (UID: \"2245b41c-0ccc-47ae-92c8-84ecfd80c53e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fxcpc" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.789324 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tnzr\" (UniqueName: \"kubernetes.io/projected/5e28238a-9bf2-4e10-827d-7350e0ec0150-kube-api-access-6tnzr\") pod \"console-f9d7485db-xtfsg\" (UID: \"5e28238a-9bf2-4e10-827d-7350e0ec0150\") " pod="openshift-console/console-f9d7485db-xtfsg" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.789417 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ba38415-c2e4-4550-a610-40bcee6d1323-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-drrzv\" (UID: \"1ba38415-c2e4-4550-a610-40bcee6d1323\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-drrzv" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.789546 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/79bfc45c-3ae9-416b-9d4b-f56fab2387de-stats-auth\") pod \"router-default-5444994796-qc826\" (UID: \"79bfc45c-3ae9-416b-9d4b-f56fab2387de\") " pod="openshift-ingress/router-default-5444994796-qc826" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.789579 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cltqg\" (UniqueName: \"kubernetes.io/projected/2236757b-1fd9-4aea-9b6d-e5f56ae7ff42-kube-api-access-cltqg\") pod \"machine-approver-56656f9798-279pq\" (UID: \"2236757b-1fd9-4aea-9b6d-e5f56ae7ff42\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-279pq" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.790934 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5z7p\" (UniqueName: \"kubernetes.io/projected/1d4a525c-7b1a-4bde-976a-d4b938c27209-kube-api-access-p5z7p\") pod \"control-plane-machine-set-operator-78cbb6b69f-76rns\" (UID: \"1d4a525c-7b1a-4bde-976a-d4b938c27209\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-76rns" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.790986 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/727a1c24-3564-4d03-910d-a1bda2a3667f-serving-cert\") pod \"openshift-config-operator-7777fb866f-dnlqj\" (UID: \"727a1c24-3564-4d03-910d-a1bda2a3667f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnlqj" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791007 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpm8r\" (UniqueName: \"kubernetes.io/projected/727a1c24-3564-4d03-910d-a1bda2a3667f-kube-api-access-lpm8r\") pod \"openshift-config-operator-7777fb866f-dnlqj\" (UID: \"727a1c24-3564-4d03-910d-a1bda2a3667f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnlqj" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791027 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b5fb857f-fab1-49d9-9c0b-a9a8bcf1dd11-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-thbzv\" (UID: \"b5fb857f-fab1-49d9-9c0b-a9a8bcf1dd11\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-thbzv" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791069 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791088 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2236757b-1fd9-4aea-9b6d-e5f56ae7ff42-machine-approver-tls\") pod \"machine-approver-56656f9798-279pq\" (UID: \"2236757b-1fd9-4aea-9b6d-e5f56ae7ff42\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-279pq" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791115 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/4f1dfc67-48c4-4c0d-9807-6b962a542d71-etcd-client\") pod \"etcd-operator-b45778765-9r92q\" (UID: \"4f1dfc67-48c4-4c0d-9807-6b962a542d71\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9r92q" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791136 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/4f1dfc67-48c4-4c0d-9807-6b962a542d71-etcd-ca\") pod \"etcd-operator-b45778765-9r92q\" (UID: \"4f1dfc67-48c4-4c0d-9807-6b962a542d71\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9r92q" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791156 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791175 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28l6r\" (UniqueName: \"kubernetes.io/projected/4f1dfc67-48c4-4c0d-9807-6b962a542d71-kube-api-access-28l6r\") pod \"etcd-operator-b45778765-9r92q\" (UID: \"4f1dfc67-48c4-4c0d-9807-6b962a542d71\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9r92q" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791195 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf1e6345-2675-413e-bd53-d456e57b08bd-client-ca\") pod \"controller-manager-879f6c89f-wbs79\" (UID: \"bf1e6345-2675-413e-bd53-d456e57b08bd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wbs79" Oct 01 10:19:20 crc kubenswrapper[4735]: 
I1001 10:19:20.791217 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791238 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/33d4526f-3a59-40c2-b9c9-d93ced5dcd17-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rcv57\" (UID: \"33d4526f-3a59-40c2-b9c9-d93ced5dcd17\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rcv57" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791273 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/19fd9940-52eb-4a65-8e75-531a27563c1b-registry-certificates\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791295 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a862cad8-e4de-41c5-a83b-734574bc958e-config\") pod \"openshift-apiserver-operator-796bbdcf4f-hwt8j\" (UID: \"a862cad8-e4de-41c5-a83b-734574bc958e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwt8j" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791325 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/5e28238a-9bf2-4e10-827d-7350e0ec0150-service-ca\") pod \"console-f9d7485db-xtfsg\" (UID: \"5e28238a-9bf2-4e10-827d-7350e0ec0150\") " pod="openshift-console/console-f9d7485db-xtfsg" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791344 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f1dfc67-48c4-4c0d-9807-6b962a542d71-config\") pod \"etcd-operator-b45778765-9r92q\" (UID: \"4f1dfc67-48c4-4c0d-9807-6b962a542d71\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9r92q" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791363 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6h2n\" (UniqueName: \"kubernetes.io/projected/8e03e157-1b07-41e4-80a2-c751ca26e2a7-kube-api-access-m6h2n\") pod \"cluster-image-registry-operator-dc59b4c8b-7962v\" (UID: \"8e03e157-1b07-41e4-80a2-c751ca26e2a7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7962v" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791390 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/79bfc45c-3ae9-416b-9d4b-f56fab2387de-default-certificate\") pod \"router-default-5444994796-qc826\" (UID: \"79bfc45c-3ae9-416b-9d4b-f56fab2387de\") " pod="openshift-ingress/router-default-5444994796-qc826" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791410 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2245b41c-0ccc-47ae-92c8-84ecfd80c53e-tmpfs\") pod \"packageserver-d55dfcdfc-fxcpc\" (UID: \"2245b41c-0ccc-47ae-92c8-84ecfd80c53e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fxcpc" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 
10:19:20.791428 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97d879b1-ea51-444b-8c31-5dc89c3a1fb2-config\") pod \"authentication-operator-69f744f599-sksft\" (UID: \"97d879b1-ea51-444b-8c31-5dc89c3a1fb2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sksft" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791464 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6718bff7-4d68-4aa2-ad2b-1511e0799683-audit-policies\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791485 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791539 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e28238a-9bf2-4e10-827d-7350e0ec0150-console-serving-cert\") pod \"console-f9d7485db-xtfsg\" (UID: \"5e28238a-9bf2-4e10-827d-7350e0ec0150\") " pod="openshift-console/console-f9d7485db-xtfsg" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791560 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/79bfc45c-3ae9-416b-9d4b-f56fab2387de-metrics-certs\") pod 
\"router-default-5444994796-qc826\" (UID: \"79bfc45c-3ae9-416b-9d4b-f56fab2387de\") " pod="openshift-ingress/router-default-5444994796-qc826" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791584 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf1e6345-2675-413e-bd53-d456e57b08bd-serving-cert\") pod \"controller-manager-879f6c89f-wbs79\" (UID: \"bf1e6345-2675-413e-bd53-d456e57b08bd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wbs79" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791604 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/70b90ab7-92ad-43c7-9796-15ec49caae3e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-hvc25\" (UID: \"70b90ab7-92ad-43c7-9796-15ec49caae3e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvc25" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791625 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2245b41c-0ccc-47ae-92c8-84ecfd80c53e-apiservice-cert\") pod \"packageserver-d55dfcdfc-fxcpc\" (UID: \"2245b41c-0ccc-47ae-92c8-84ecfd80c53e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fxcpc" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791667 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a862cad8-e4de-41c5-a83b-734574bc958e-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-hwt8j\" (UID: \"a862cad8-e4de-41c5-a83b-734574bc958e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwt8j" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791703 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgg2t\" (UniqueName: \"kubernetes.io/projected/79bfc45c-3ae9-416b-9d4b-f56fab2387de-kube-api-access-fgg2t\") pod \"router-default-5444994796-qc826\" (UID: \"79bfc45c-3ae9-416b-9d4b-f56fab2387de\") " pod="openshift-ingress/router-default-5444994796-qc826" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791724 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5e28238a-9bf2-4e10-827d-7350e0ec0150-console-config\") pod \"console-f9d7485db-xtfsg\" (UID: \"5e28238a-9bf2-4e10-827d-7350e0ec0150\") " pod="openshift-console/console-f9d7485db-xtfsg" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791755 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/19fd9940-52eb-4a65-8e75-531a27563c1b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791776 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33d4526f-3a59-40c2-b9c9-d93ced5dcd17-proxy-tls\") pod \"machine-config-controller-84d6567774-rcv57\" (UID: \"33d4526f-3a59-40c2-b9c9-d93ced5dcd17\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rcv57" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791797 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79bfc45c-3ae9-416b-9d4b-f56fab2387de-service-ca-bundle\") pod \"router-default-5444994796-qc826\" (UID: 
\"79bfc45c-3ae9-416b-9d4b-f56fab2387de\") " pod="openshift-ingress/router-default-5444994796-qc826" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791820 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791854 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpjrb\" (UniqueName: \"kubernetes.io/projected/6718bff7-4d68-4aa2-ad2b-1511e0799683-kube-api-access-bpjrb\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791874 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klqcz\" (UniqueName: \"kubernetes.io/projected/5e814370-2cf9-4f05-ba2a-cf771d55529f-kube-api-access-klqcz\") pod \"dns-operator-744455d44c-d4p82\" (UID: \"5e814370-2cf9-4f05-ba2a-cf771d55529f\") " pod="openshift-dns-operator/dns-operator-744455d44c-d4p82" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791894 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd8vw\" (UniqueName: \"kubernetes.io/projected/19fd9940-52eb-4a65-8e75-531a27563c1b-kube-api-access-wd8vw\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791912 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f6a88e2-4257-4ed0-9381-6dcd9d5647bb-trusted-ca\") pod \"ingress-operator-5b745b69d9-8dkrf\" (UID: \"4f6a88e2-4257-4ed0-9381-6dcd9d5647bb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8dkrf" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791940 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.791975 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e28238a-9bf2-4e10-827d-7350e0ec0150-trusted-ca-bundle\") pod \"console-f9d7485db-xtfsg\" (UID: \"5e28238a-9bf2-4e10-827d-7350e0ec0150\") " pod="openshift-console/console-f9d7485db-xtfsg" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.792000 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8e03e157-1b07-41e4-80a2-c751ca26e2a7-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7962v\" (UID: \"8e03e157-1b07-41e4-80a2-c751ca26e2a7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7962v" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.792022 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/19fd9940-52eb-4a65-8e75-531a27563c1b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-klpcq\" (UID: 
\"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.792040 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5e28238a-9bf2-4e10-827d-7350e0ec0150-console-oauth-config\") pod \"console-f9d7485db-xtfsg\" (UID: \"5e28238a-9bf2-4e10-827d-7350e0ec0150\") " pod="openshift-console/console-f9d7485db-xtfsg" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.792063 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8k2p\" (UniqueName: \"kubernetes.io/projected/97d879b1-ea51-444b-8c31-5dc89c3a1fb2-kube-api-access-c8k2p\") pod \"authentication-operator-69f744f599-sksft\" (UID: \"97d879b1-ea51-444b-8c31-5dc89c3a1fb2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sksft" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.792277 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97d879b1-ea51-444b-8c31-5dc89c3a1fb2-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-sksft\" (UID: \"97d879b1-ea51-444b-8c31-5dc89c3a1fb2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sksft" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.792306 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1d4a525c-7b1a-4bde-976a-d4b938c27209-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-76rns\" (UID: \"1d4a525c-7b1a-4bde-976a-d4b938c27209\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-76rns" Oct 01 10:19:20 crc 
kubenswrapper[4735]: I1001 10:19:20.792344 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/4f1dfc67-48c4-4c0d-9807-6b962a542d71-etcd-service-ca\") pod \"etcd-operator-b45778765-9r92q\" (UID: \"4f1dfc67-48c4-4c0d-9807-6b962a542d71\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9r92q" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.792372 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: E1001 10:19:20.793970 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:21.293954846 +0000 UTC m=+119.986776108 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.797212 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f1dfc67-48c4-4c0d-9807-6b962a542d71-serving-cert\") pod \"etcd-operator-b45778765-9r92q\" (UID: \"4f1dfc67-48c4-4c0d-9807-6b962a542d71\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9r92q" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.797257 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8e03e157-1b07-41e4-80a2-c751ca26e2a7-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7962v\" (UID: \"8e03e157-1b07-41e4-80a2-c751ca26e2a7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7962v" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.797282 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.797304 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/727a1c24-3564-4d03-910d-a1bda2a3667f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-dnlqj\" (UID: \"727a1c24-3564-4d03-910d-a1bda2a3667f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnlqj" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.810130 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kkmzc" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.810722 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-625dj"] Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.825538 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kqslj" Oct 01 10:19:20 crc kubenswrapper[4735]: W1001 10:19:20.826214 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd604b23e_ed8c_486f_b7d5_5a5ddd308947.slice/crio-255e42b188080997a5dea4e08ffe0b07a757383c89c3f1d2bb04affcee905698 WatchSource:0}: Error finding container 255e42b188080997a5dea4e08ffe0b07a757383c89c3f1d2bb04affcee905698: Status 404 returned error can't find the container with id 255e42b188080997a5dea4e08ffe0b07a757383c89c3f1d2bb04affcee905698 Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.847470 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-sjt7w"] Oct 01 10:19:20 crc kubenswrapper[4735]: W1001 10:19:20.858412 4735 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64029635_e98e_458d_ace4_275d8f40d34e.slice/crio-025247d811ab100a1e325b04e3e73dd06821fd126ac1bc564ba6029fbed46c39 WatchSource:0}: Error finding container 025247d811ab100a1e325b04e3e73dd06821fd126ac1bc564ba6029fbed46c39: Status 404 returned error can't find the container with id 025247d811ab100a1e325b04e3e73dd06821fd126ac1bc564ba6029fbed46c39 Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.898675 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.899171 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.899236 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7cdf3b9f-6a98-4b7c-9953-a8623f968d89-images\") pod \"machine-config-operator-74547568cd-phjsf\" (UID: \"7cdf3b9f-6a98-4b7c-9953-a8623f968d89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-phjsf" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.899262 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2236757b-1fd9-4aea-9b6d-e5f56ae7ff42-config\") pod \"machine-approver-56656f9798-279pq\" (UID: 
\"2236757b-1fd9-4aea-9b6d-e5f56ae7ff42\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-279pq" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.899278 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxwnz\" (UniqueName: \"kubernetes.io/projected/9999d995-4881-4349-ba17-89eef6722d16-kube-api-access-dxwnz\") pod \"csi-hostpathplugin-l67sz\" (UID: \"9999d995-4881-4349-ba17-89eef6722d16\") " pod="hostpath-provisioner/csi-hostpathplugin-l67sz" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.899521 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.899542 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97d879b1-ea51-444b-8c31-5dc89c3a1fb2-serving-cert\") pod \"authentication-operator-69f744f599-sksft\" (UID: \"97d879b1-ea51-444b-8c31-5dc89c3a1fb2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sksft" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.899558 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp99f\" (UniqueName: \"kubernetes.io/projected/a862cad8-e4de-41c5-a83b-734574bc958e-kube-api-access-pp99f\") pod \"openshift-apiserver-operator-796bbdcf4f-hwt8j\" (UID: \"a862cad8-e4de-41c5-a83b-734574bc958e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwt8j" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.899597 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c3ecaab8-1253-4559-bc4f-76871e1edff8-srv-cert\") pod \"catalog-operator-68c6474976-t7rvp\" (UID: \"c3ecaab8-1253-4559-bc4f-76871e1edff8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7rvp" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.899617 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4f6a88e2-4257-4ed0-9381-6dcd9d5647bb-bound-sa-token\") pod \"ingress-operator-5b745b69d9-8dkrf\" (UID: \"4f6a88e2-4257-4ed0-9381-6dcd9d5647bb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8dkrf" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.899633 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjcn5\" (UniqueName: \"kubernetes.io/projected/4f6a88e2-4257-4ed0-9381-6dcd9d5647bb-kube-api-access-vjcn5\") pod \"ingress-operator-5b745b69d9-8dkrf\" (UID: \"4f6a88e2-4257-4ed0-9381-6dcd9d5647bb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8dkrf" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.899648 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf1e6345-2675-413e-bd53-d456e57b08bd-config\") pod \"controller-manager-879f6c89f-wbs79\" (UID: \"bf1e6345-2675-413e-bd53-d456e57b08bd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wbs79" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.899686 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2wnd\" (UniqueName: \"kubernetes.io/projected/bf1e6345-2675-413e-bd53-d456e57b08bd-kube-api-access-p2wnd\") pod \"controller-manager-879f6c89f-wbs79\" (UID: \"bf1e6345-2675-413e-bd53-d456e57b08bd\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-wbs79" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.899704 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2245b41c-0ccc-47ae-92c8-84ecfd80c53e-webhook-cert\") pod \"packageserver-d55dfcdfc-fxcpc\" (UID: \"2245b41c-0ccc-47ae-92c8-84ecfd80c53e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fxcpc" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.899720 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5e28238a-9bf2-4e10-827d-7350e0ec0150-oauth-serving-cert\") pod \"console-f9d7485db-xtfsg\" (UID: \"5e28238a-9bf2-4e10-827d-7350e0ec0150\") " pod="openshift-console/console-f9d7485db-xtfsg" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.899756 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97d879b1-ea51-444b-8c31-5dc89c3a1fb2-service-ca-bundle\") pod \"authentication-operator-69f744f599-sksft\" (UID: \"97d879b1-ea51-444b-8c31-5dc89c3a1fb2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sksft" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.899774 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5cnf\" (UniqueName: \"kubernetes.io/projected/56f4b38b-92d5-434f-8b9e-dff6c5cb054a-kube-api-access-b5cnf\") pod \"multus-admission-controller-857f4d67dd-2jcl8\" (UID: \"56f4b38b-92d5-434f-8b9e-dff6c5cb054a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2jcl8" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.899792 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/19fd9940-52eb-4a65-8e75-531a27563c1b-registry-tls\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.899845 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/70b90ab7-92ad-43c7-9796-15ec49caae3e-srv-cert\") pod \"olm-operator-6b444d44fb-hvc25\" (UID: \"70b90ab7-92ad-43c7-9796-15ec49caae3e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvc25" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.899923 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/19fd9940-52eb-4a65-8e75-531a27563c1b-trusted-ca\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.899949 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6718bff7-4d68-4aa2-ad2b-1511e0799683-audit-dir\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.900031 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/19fd9940-52eb-4a65-8e75-531a27563c1b-bound-sa-token\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.900075 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-k6d59\" (UniqueName: \"kubernetes.io/projected/70b90ab7-92ad-43c7-9796-15ec49caae3e-kube-api-access-k6d59\") pod \"olm-operator-6b444d44fb-hvc25\" (UID: \"70b90ab7-92ad-43c7-9796-15ec49caae3e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvc25" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.900105 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdx7d\" (UniqueName: \"kubernetes.io/projected/2245b41c-0ccc-47ae-92c8-84ecfd80c53e-kube-api-access-jdx7d\") pod \"packageserver-d55dfcdfc-fxcpc\" (UID: \"2245b41c-0ccc-47ae-92c8-84ecfd80c53e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fxcpc" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.900139 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8678dbc-4d5c-4081-af9f-b26869f98034-config-volume\") pod \"dns-default-qmlh7\" (UID: \"f8678dbc-4d5c-4081-af9f-b26869f98034\") " pod="openshift-dns/dns-default-qmlh7" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.900170 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tnzr\" (UniqueName: \"kubernetes.io/projected/5e28238a-9bf2-4e10-827d-7350e0ec0150-kube-api-access-6tnzr\") pod \"console-f9d7485db-xtfsg\" (UID: \"5e28238a-9bf2-4e10-827d-7350e0ec0150\") " pod="openshift-console/console-f9d7485db-xtfsg" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.900226 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ba38415-c2e4-4550-a610-40bcee6d1323-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-drrzv\" (UID: \"1ba38415-c2e4-4550-a610-40bcee6d1323\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-drrzv" Oct 01 10:19:20 crc 
kubenswrapper[4735]: I1001 10:19:20.900255 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/79bfc45c-3ae9-416b-9d4b-f56fab2387de-stats-auth\") pod \"router-default-5444994796-qc826\" (UID: \"79bfc45c-3ae9-416b-9d4b-f56fab2387de\") " pod="openshift-ingress/router-default-5444994796-qc826" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.900282 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cltqg\" (UniqueName: \"kubernetes.io/projected/2236757b-1fd9-4aea-9b6d-e5f56ae7ff42-kube-api-access-cltqg\") pod \"machine-approver-56656f9798-279pq\" (UID: \"2236757b-1fd9-4aea-9b6d-e5f56ae7ff42\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-279pq" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.900330 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5z7p\" (UniqueName: \"kubernetes.io/projected/1d4a525c-7b1a-4bde-976a-d4b938c27209-kube-api-access-p5z7p\") pod \"control-plane-machine-set-operator-78cbb6b69f-76rns\" (UID: \"1d4a525c-7b1a-4bde-976a-d4b938c27209\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-76rns" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.900364 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/727a1c24-3564-4d03-910d-a1bda2a3667f-serving-cert\") pod \"openshift-config-operator-7777fb866f-dnlqj\" (UID: \"727a1c24-3564-4d03-910d-a1bda2a3667f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnlqj" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.900391 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpm8r\" (UniqueName: \"kubernetes.io/projected/727a1c24-3564-4d03-910d-a1bda2a3667f-kube-api-access-lpm8r\") pod 
\"openshift-config-operator-7777fb866f-dnlqj\" (UID: \"727a1c24-3564-4d03-910d-a1bda2a3667f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnlqj" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.900415 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b5fb857f-fab1-49d9-9c0b-a9a8bcf1dd11-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-thbzv\" (UID: \"b5fb857f-fab1-49d9-9c0b-a9a8bcf1dd11\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-thbzv" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.900458 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4f1dfc67-48c4-4c0d-9807-6b962a542d71-etcd-client\") pod \"etcd-operator-b45778765-9r92q\" (UID: \"4f1dfc67-48c4-4c0d-9807-6b962a542d71\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9r92q" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.900544 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.900672 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2236757b-1fd9-4aea-9b6d-e5f56ae7ff42-machine-approver-tls\") pod \"machine-approver-56656f9798-279pq\" (UID: \"2236757b-1fd9-4aea-9b6d-e5f56ae7ff42\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-279pq" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.900700 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9knlc\" (UniqueName: \"kubernetes.io/projected/b79ac2f9-c8bc-4893-ace2-ca598d77ff52-kube-api-access-9knlc\") pod \"marketplace-operator-79b997595-cqpz2\" (UID: \"b79ac2f9-c8bc-4893-ace2-ca598d77ff52\") " pod="openshift-marketplace/marketplace-operator-79b997595-cqpz2" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.900762 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.900819 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/4f1dfc67-48c4-4c0d-9807-6b962a542d71-etcd-ca\") pod \"etcd-operator-b45778765-9r92q\" (UID: \"4f1dfc67-48c4-4c0d-9807-6b962a542d71\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9r92q" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.900897 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/19fd9940-52eb-4a65-8e75-531a27563c1b-registry-certificates\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.900929 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28l6r\" (UniqueName: \"kubernetes.io/projected/4f1dfc67-48c4-4c0d-9807-6b962a542d71-kube-api-access-28l6r\") pod \"etcd-operator-b45778765-9r92q\" (UID: \"4f1dfc67-48c4-4c0d-9807-6b962a542d71\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-9r92q" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.900955 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf1e6345-2675-413e-bd53-d456e57b08bd-client-ca\") pod \"controller-manager-879f6c89f-wbs79\" (UID: \"bf1e6345-2675-413e-bd53-d456e57b08bd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wbs79" Oct 01 10:19:20 crc kubenswrapper[4735]: E1001 10:19:20.901018 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:21.400993923 +0000 UTC m=+120.093815285 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.901047 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.901077 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/33d4526f-3a59-40c2-b9c9-d93ced5dcd17-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rcv57\" (UID: \"33d4526f-3a59-40c2-b9c9-d93ced5dcd17\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rcv57" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.901109 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xql4l\" (UniqueName: \"kubernetes.io/projected/c3ecaab8-1253-4559-bc4f-76871e1edff8-kube-api-access-xql4l\") pod \"catalog-operator-68c6474976-t7rvp\" (UID: \"c3ecaab8-1253-4559-bc4f-76871e1edff8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7rvp" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.901145 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5e28238a-9bf2-4e10-827d-7350e0ec0150-service-ca\") pod \"console-f9d7485db-xtfsg\" (UID: \"5e28238a-9bf2-4e10-827d-7350e0ec0150\") " pod="openshift-console/console-f9d7485db-xtfsg" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.901179 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a862cad8-e4de-41c5-a83b-734574bc958e-config\") pod \"openshift-apiserver-operator-796bbdcf4f-hwt8j\" (UID: \"a862cad8-e4de-41c5-a83b-734574bc958e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwt8j" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.901201 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbfj9\" (UniqueName: \"kubernetes.io/projected/f8678dbc-4d5c-4081-af9f-b26869f98034-kube-api-access-lbfj9\") pod \"dns-default-qmlh7\" (UID: \"f8678dbc-4d5c-4081-af9f-b26869f98034\") " pod="openshift-dns/dns-default-qmlh7" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 
10:19:20.901253 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f1dfc67-48c4-4c0d-9807-6b962a542d71-config\") pod \"etcd-operator-b45778765-9r92q\" (UID: \"4f1dfc67-48c4-4c0d-9807-6b962a542d71\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9r92q" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.901276 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6h2n\" (UniqueName: \"kubernetes.io/projected/8e03e157-1b07-41e4-80a2-c751ca26e2a7-kube-api-access-m6h2n\") pod \"cluster-image-registry-operator-dc59b4c8b-7962v\" (UID: \"8e03e157-1b07-41e4-80a2-c751ca26e2a7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7962v" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.901315 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97d879b1-ea51-444b-8c31-5dc89c3a1fb2-config\") pod \"authentication-operator-69f744f599-sksft\" (UID: \"97d879b1-ea51-444b-8c31-5dc89c3a1fb2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sksft" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.901339 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/79bfc45c-3ae9-416b-9d4b-f56fab2387de-default-certificate\") pod \"router-default-5444994796-qc826\" (UID: \"79bfc45c-3ae9-416b-9d4b-f56fab2387de\") " pod="openshift-ingress/router-default-5444994796-qc826" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.901382 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28a73dbc-e661-4ceb-80cb-66a21c6895e1-config\") pod \"kube-apiserver-operator-766d6c64bb-nwvhh\" (UID: \"28a73dbc-e661-4ceb-80cb-66a21c6895e1\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwvhh" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.901407 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2245b41c-0ccc-47ae-92c8-84ecfd80c53e-tmpfs\") pod \"packageserver-d55dfcdfc-fxcpc\" (UID: \"2245b41c-0ccc-47ae-92c8-84ecfd80c53e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fxcpc" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.901431 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81c0d8ce-d766-46bb-bd2b-79043b149331-cert\") pod \"ingress-canary-fkz86\" (UID: \"81c0d8ce-d766-46bb-bd2b-79043b149331\") " pod="openshift-ingress-canary/ingress-canary-fkz86" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.901458 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6718bff7-4d68-4aa2-ad2b-1511e0799683-audit-policies\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.901450 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2236757b-1fd9-4aea-9b6d-e5f56ae7ff42-config\") pod \"machine-approver-56656f9798-279pq\" (UID: \"2236757b-1fd9-4aea-9b6d-e5f56ae7ff42\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-279pq" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.901481 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.901540 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28a73dbc-e661-4ceb-80cb-66a21c6895e1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-nwvhh\" (UID: \"28a73dbc-e661-4ceb-80cb-66a21c6895e1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwvhh" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.901568 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e28238a-9bf2-4e10-827d-7350e0ec0150-console-serving-cert\") pod \"console-f9d7485db-xtfsg\" (UID: \"5e28238a-9bf2-4e10-827d-7350e0ec0150\") " pod="openshift-console/console-f9d7485db-xtfsg" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.901590 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/79bfc45c-3ae9-416b-9d4b-f56fab2387de-metrics-certs\") pod \"router-default-5444994796-qc826\" (UID: \"79bfc45c-3ae9-416b-9d4b-f56fab2387de\") " pod="openshift-ingress/router-default-5444994796-qc826" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.901613 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4d1aa0d7-e9f4-48eb-86ef-ec9c9451a447-certs\") pod \"machine-config-server-tg55d\" (UID: \"4d1aa0d7-e9f4-48eb-86ef-ec9c9451a447\") " pod="openshift-machine-config-operator/machine-config-server-tg55d" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.901641 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bf1e6345-2675-413e-bd53-d456e57b08bd-serving-cert\") pod \"controller-manager-879f6c89f-wbs79\" (UID: \"bf1e6345-2675-413e-bd53-d456e57b08bd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wbs79" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.901662 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa5136cb-51cf-4bc5-8839-d3cbf64e4c39-config-volume\") pod \"collect-profiles-29321895-x8dfs\" (UID: \"fa5136cb-51cf-4bc5-8839-d3cbf64e4c39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321895-x8dfs" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.901688 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/70b90ab7-92ad-43c7-9796-15ec49caae3e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-hvc25\" (UID: \"70b90ab7-92ad-43c7-9796-15ec49caae3e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvc25" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.901718 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2245b41c-0ccc-47ae-92c8-84ecfd80c53e-apiservice-cert\") pod \"packageserver-d55dfcdfc-fxcpc\" (UID: \"2245b41c-0ccc-47ae-92c8-84ecfd80c53e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fxcpc" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.901742 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4d1aa0d7-e9f4-48eb-86ef-ec9c9451a447-node-bootstrap-token\") pod \"machine-config-server-tg55d\" (UID: \"4d1aa0d7-e9f4-48eb-86ef-ec9c9451a447\") " pod="openshift-machine-config-operator/machine-config-server-tg55d" Oct 01 10:19:20 crc 
kubenswrapper[4735]: I1001 10:19:20.901794 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a862cad8-e4de-41c5-a83b-734574bc958e-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-hwt8j\" (UID: \"a862cad8-e4de-41c5-a83b-734574bc958e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwt8j" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.901820 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7cdf3b9f-6a98-4b7c-9953-a8623f968d89-auth-proxy-config\") pod \"machine-config-operator-74547568cd-phjsf\" (UID: \"7cdf3b9f-6a98-4b7c-9953-a8623f968d89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-phjsf" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.901844 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc0f68ff-8c8c-40f6-bd13-e0bec87fb9f2-serving-cert\") pod \"service-ca-operator-777779d784-cx2v2\" (UID: \"fc0f68ff-8c8c-40f6-bd13-e0bec87fb9f2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cx2v2" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.901904 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b79ac2f9-c8bc-4893-ace2-ca598d77ff52-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cqpz2\" (UID: \"b79ac2f9-c8bc-4893-ace2-ca598d77ff52\") " pod="openshift-marketplace/marketplace-operator-79b997595-cqpz2" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.901928 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/7cdf3b9f-6a98-4b7c-9953-a8623f968d89-proxy-tls\") pod \"machine-config-operator-74547568cd-phjsf\" (UID: \"7cdf3b9f-6a98-4b7c-9953-a8623f968d89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-phjsf" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.901952 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f22tm\" (UniqueName: \"kubernetes.io/projected/fa5136cb-51cf-4bc5-8839-d3cbf64e4c39-kube-api-access-f22tm\") pod \"collect-profiles-29321895-x8dfs\" (UID: \"fa5136cb-51cf-4bc5-8839-d3cbf64e4c39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321895-x8dfs" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.901949 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf1e6345-2675-413e-bd53-d456e57b08bd-client-ca\") pod \"controller-manager-879f6c89f-wbs79\" (UID: \"bf1e6345-2675-413e-bd53-d456e57b08bd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wbs79" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.901978 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgg2t\" (UniqueName: \"kubernetes.io/projected/79bfc45c-3ae9-416b-9d4b-f56fab2387de-kube-api-access-fgg2t\") pod \"router-default-5444994796-qc826\" (UID: \"79bfc45c-3ae9-416b-9d4b-f56fab2387de\") " pod="openshift-ingress/router-default-5444994796-qc826" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.902036 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9w8t\" (UniqueName: \"kubernetes.io/projected/7cdf3b9f-6a98-4b7c-9953-a8623f968d89-kube-api-access-j9w8t\") pod \"machine-config-operator-74547568cd-phjsf\" (UID: \"7cdf3b9f-6a98-4b7c-9953-a8623f968d89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-phjsf" 
Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.902088 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5e28238a-9bf2-4e10-827d-7350e0ec0150-console-config\") pod \"console-f9d7485db-xtfsg\" (UID: \"5e28238a-9bf2-4e10-827d-7350e0ec0150\") " pod="openshift-console/console-f9d7485db-xtfsg" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.902108 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/19fd9940-52eb-4a65-8e75-531a27563c1b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.902136 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/76458fc4-8f1e-462d-8a5c-1af31c52f7b6-signing-cabundle\") pod \"service-ca-9c57cc56f-xzlpn\" (UID: \"76458fc4-8f1e-462d-8a5c-1af31c52f7b6\") " pod="openshift-service-ca/service-ca-9c57cc56f-xzlpn" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.902157 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33d4526f-3a59-40c2-b9c9-d93ced5dcd17-proxy-tls\") pod \"machine-config-controller-84d6567774-rcv57\" (UID: \"33d4526f-3a59-40c2-b9c9-d93ced5dcd17\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rcv57" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.902177 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: 
\"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.902194 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79bfc45c-3ae9-416b-9d4b-f56fab2387de-service-ca-bundle\") pod \"router-default-5444994796-qc826\" (UID: \"79bfc45c-3ae9-416b-9d4b-f56fab2387de\") " pod="openshift-ingress/router-default-5444994796-qc826" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.902213 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9999d995-4881-4349-ba17-89eef6722d16-csi-data-dir\") pod \"csi-hostpathplugin-l67sz\" (UID: \"9999d995-4881-4349-ba17-89eef6722d16\") " pod="hostpath-provisioner/csi-hostpathplugin-l67sz" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.902235 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd8vw\" (UniqueName: \"kubernetes.io/projected/19fd9940-52eb-4a65-8e75-531a27563c1b-kube-api-access-wd8vw\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.902254 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f6a88e2-4257-4ed0-9381-6dcd9d5647bb-trusted-ca\") pod \"ingress-operator-5b745b69d9-8dkrf\" (UID: \"4f6a88e2-4257-4ed0-9381-6dcd9d5647bb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8dkrf" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.902270 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpjrb\" (UniqueName: 
\"kubernetes.io/projected/6718bff7-4d68-4aa2-ad2b-1511e0799683-kube-api-access-bpjrb\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.902292 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klqcz\" (UniqueName: \"kubernetes.io/projected/5e814370-2cf9-4f05-ba2a-cf771d55529f-kube-api-access-klqcz\") pod \"dns-operator-744455d44c-d4p82\" (UID: \"5e814370-2cf9-4f05-ba2a-cf771d55529f\") " pod="openshift-dns-operator/dns-operator-744455d44c-d4p82" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.902325 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ba38415-c2e4-4550-a610-40bcee6d1323-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-drrzv\" (UID: \"1ba38415-c2e4-4550-a610-40bcee6d1323\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-drrzv" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.902336 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.902433 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e28238a-9bf2-4e10-827d-7350e0ec0150-trusted-ca-bundle\") pod \"console-f9d7485db-xtfsg\" (UID: \"5e28238a-9bf2-4e10-827d-7350e0ec0150\") " pod="openshift-console/console-f9d7485db-xtfsg" Oct 01 10:19:20 crc 
kubenswrapper[4735]: I1001 10:19:20.902462 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa5136cb-51cf-4bc5-8839-d3cbf64e4c39-secret-volume\") pod \"collect-profiles-29321895-x8dfs\" (UID: \"fa5136cb-51cf-4bc5-8839-d3cbf64e4c39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321895-x8dfs" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.902482 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8678dbc-4d5c-4081-af9f-b26869f98034-metrics-tls\") pod \"dns-default-qmlh7\" (UID: \"f8678dbc-4d5c-4081-af9f-b26869f98034\") " pod="openshift-dns/dns-default-qmlh7" Oct 01 10:19:20 crc kubenswrapper[4735]: E1001 10:19:20.902621 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:21.402598276 +0000 UTC m=+120.095419538 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.902645 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpw5s\" (UniqueName: \"kubernetes.io/projected/4d1aa0d7-e9f4-48eb-86ef-ec9c9451a447-kube-api-access-hpw5s\") pod \"machine-config-server-tg55d\" (UID: \"4d1aa0d7-e9f4-48eb-86ef-ec9c9451a447\") " pod="openshift-machine-config-operator/machine-config-server-tg55d" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.902678 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8e03e157-1b07-41e4-80a2-c751ca26e2a7-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7962v\" (UID: \"8e03e157-1b07-41e4-80a2-c751ca26e2a7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7962v" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.902695 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b79ac2f9-c8bc-4893-ace2-ca598d77ff52-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cqpz2\" (UID: \"b79ac2f9-c8bc-4893-ace2-ca598d77ff52\") " pod="openshift-marketplace/marketplace-operator-79b997595-cqpz2" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.902714 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/28a73dbc-e661-4ceb-80cb-66a21c6895e1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-nwvhh\" (UID: \"28a73dbc-e661-4ceb-80cb-66a21c6895e1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwvhh" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.902733 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq9s7\" (UniqueName: \"kubernetes.io/projected/fc0f68ff-8c8c-40f6-bd13-e0bec87fb9f2-kube-api-access-mq9s7\") pod \"service-ca-operator-777779d784-cx2v2\" (UID: \"fc0f68ff-8c8c-40f6-bd13-e0bec87fb9f2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cx2v2" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.902767 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/19fd9940-52eb-4a65-8e75-531a27563c1b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.902787 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5e28238a-9bf2-4e10-827d-7350e0ec0150-console-oauth-config\") pod \"console-f9d7485db-xtfsg\" (UID: \"5e28238a-9bf2-4e10-827d-7350e0ec0150\") " pod="openshift-console/console-f9d7485db-xtfsg" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.902805 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9999d995-4881-4349-ba17-89eef6722d16-socket-dir\") pod \"csi-hostpathplugin-l67sz\" (UID: \"9999d995-4881-4349-ba17-89eef6722d16\") " pod="hostpath-provisioner/csi-hostpathplugin-l67sz" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 
10:19:20.902822 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/76458fc4-8f1e-462d-8a5c-1af31c52f7b6-signing-key\") pod \"service-ca-9c57cc56f-xzlpn\" (UID: \"76458fc4-8f1e-462d-8a5c-1af31c52f7b6\") " pod="openshift-service-ca/service-ca-9c57cc56f-xzlpn" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.903476 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf1e6345-2675-413e-bd53-d456e57b08bd-config\") pod \"controller-manager-879f6c89f-wbs79\" (UID: \"bf1e6345-2675-413e-bd53-d456e57b08bd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wbs79" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.903940 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97d879b1-ea51-444b-8c31-5dc89c3a1fb2-config\") pod \"authentication-operator-69f744f599-sksft\" (UID: \"97d879b1-ea51-444b-8c31-5dc89c3a1fb2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sksft" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.904453 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9999d995-4881-4349-ba17-89eef6722d16-plugins-dir\") pod \"csi-hostpathplugin-l67sz\" (UID: \"9999d995-4881-4349-ba17-89eef6722d16\") " pod="hostpath-provisioner/csi-hostpathplugin-l67sz" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.904518 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8k2p\" (UniqueName: \"kubernetes.io/projected/97d879b1-ea51-444b-8c31-5dc89c3a1fb2-kube-api-access-c8k2p\") pod \"authentication-operator-69f744f599-sksft\" (UID: \"97d879b1-ea51-444b-8c31-5dc89c3a1fb2\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-sksft" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.904540 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97d879b1-ea51-444b-8c31-5dc89c3a1fb2-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-sksft\" (UID: \"97d879b1-ea51-444b-8c31-5dc89c3a1fb2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sksft" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.904574 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1d4a525c-7b1a-4bde-976a-d4b938c27209-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-76rns\" (UID: \"1d4a525c-7b1a-4bde-976a-d4b938c27209\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-76rns" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.904598 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncdsf\" (UniqueName: \"kubernetes.io/projected/81c0d8ce-d766-46bb-bd2b-79043b149331-kube-api-access-ncdsf\") pod \"ingress-canary-fkz86\" (UID: \"81c0d8ce-d766-46bb-bd2b-79043b149331\") " pod="openshift-ingress-canary/ingress-canary-fkz86" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.904614 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz7hh\" (UniqueName: \"kubernetes.io/projected/afb4417d-4b1d-4d1e-8724-a78b3781b4f1-kube-api-access-bz7hh\") pod \"migrator-59844c95c7-sngf8\" (UID: \"afb4417d-4b1d-4d1e-8724-a78b3781b4f1\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sngf8" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.905092 4735 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/33d4526f-3a59-40c2-b9c9-d93ced5dcd17-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rcv57\" (UID: \"33d4526f-3a59-40c2-b9c9-d93ced5dcd17\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rcv57" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.905528 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6718bff7-4d68-4aa2-ad2b-1511e0799683-audit-dir\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.905630 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/4f1dfc67-48c4-4c0d-9807-6b962a542d71-etcd-service-ca\") pod \"etcd-operator-b45778765-9r92q\" (UID: \"4f1dfc67-48c4-4c0d-9807-6b962a542d71\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9r92q" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.905656 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fswbb\" (UniqueName: \"kubernetes.io/projected/76458fc4-8f1e-462d-8a5c-1af31c52f7b6-kube-api-access-fswbb\") pod \"service-ca-9c57cc56f-xzlpn\" (UID: \"76458fc4-8f1e-462d-8a5c-1af31c52f7b6\") " pod="openshift-service-ca/service-ca-9c57cc56f-xzlpn" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.905674 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9631d017-2723-4e2f-b285-c38333929edf-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-fkcd4\" (UID: \"9631d017-2723-4e2f-b285-c38333929edf\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fkcd4" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.905704 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9999d995-4881-4349-ba17-89eef6722d16-mountpoint-dir\") pod \"csi-hostpathplugin-l67sz\" (UID: \"9999d995-4881-4349-ba17-89eef6722d16\") " pod="hostpath-provisioner/csi-hostpathplugin-l67sz" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.905754 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.905775 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.905793 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f1dfc67-48c4-4c0d-9807-6b962a542d71-serving-cert\") pod \"etcd-operator-b45778765-9r92q\" (UID: \"4f1dfc67-48c4-4c0d-9807-6b962a542d71\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9r92q" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.905810 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/8e03e157-1b07-41e4-80a2-c751ca26e2a7-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7962v\" (UID: \"8e03e157-1b07-41e4-80a2-c751ca26e2a7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7962v" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.905832 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/727a1c24-3564-4d03-910d-a1bda2a3667f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-dnlqj\" (UID: \"727a1c24-3564-4d03-910d-a1bda2a3667f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnlqj" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.905850 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/56f4b38b-92d5-434f-8b9e-dff6c5cb054a-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2jcl8\" (UID: \"56f4b38b-92d5-434f-8b9e-dff6c5cb054a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2jcl8" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.905867 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc0f68ff-8c8c-40f6-bd13-e0bec87fb9f2-config\") pod \"service-ca-operator-777779d784-cx2v2\" (UID: \"fc0f68ff-8c8c-40f6-bd13-e0bec87fb9f2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cx2v2" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.905897 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjvdp\" (UniqueName: \"kubernetes.io/projected/1ba38415-c2e4-4550-a610-40bcee6d1323-kube-api-access-fjvdp\") pod \"kube-storage-version-migrator-operator-b67b599dd-drrzv\" (UID: \"1ba38415-c2e4-4550-a610-40bcee6d1323\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-drrzv" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.905915 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dkj7\" (UniqueName: \"kubernetes.io/projected/b5fb857f-fab1-49d9-9c0b-a9a8bcf1dd11-kube-api-access-7dkj7\") pod \"cluster-samples-operator-665b6dd947-thbzv\" (UID: \"b5fb857f-fab1-49d9-9c0b-a9a8bcf1dd11\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-thbzv" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.905933 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2236757b-1fd9-4aea-9b6d-e5f56ae7ff42-auth-proxy-config\") pod \"machine-approver-56656f9798-279pq\" (UID: \"2236757b-1fd9-4aea-9b6d-e5f56ae7ff42\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-279pq" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.905963 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c3ecaab8-1253-4559-bc4f-76871e1edff8-profile-collector-cert\") pod \"catalog-operator-68c6474976-t7rvp\" (UID: \"c3ecaab8-1253-4559-bc4f-76871e1edff8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7rvp" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.905979 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf1e6345-2675-413e-bd53-d456e57b08bd-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-wbs79\" (UID: \"bf1e6345-2675-413e-bd53-d456e57b08bd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wbs79" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.906014 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8e03e157-1b07-41e4-80a2-c751ca26e2a7-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7962v\" (UID: \"8e03e157-1b07-41e4-80a2-c751ca26e2a7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7962v" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.906031 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.906049 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.906067 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9999d995-4881-4349-ba17-89eef6722d16-registration-dir\") pod \"csi-hostpathplugin-l67sz\" (UID: \"9999d995-4881-4349-ba17-89eef6722d16\") " pod="hostpath-provisioner/csi-hostpathplugin-l67sz" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.906085 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5e814370-2cf9-4f05-ba2a-cf771d55529f-metrics-tls\") pod \"dns-operator-744455d44c-d4p82\" (UID: \"5e814370-2cf9-4f05-ba2a-cf771d55529f\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-d4p82" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.906102 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzlj4\" (UniqueName: \"kubernetes.io/projected/9631d017-2723-4e2f-b285-c38333929edf-kube-api-access-fzlj4\") pod \"package-server-manager-789f6589d5-fkcd4\" (UID: \"9631d017-2723-4e2f-b285-c38333929edf\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fkcd4" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.906119 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4f6a88e2-4257-4ed0-9381-6dcd9d5647bb-metrics-tls\") pod \"ingress-operator-5b745b69d9-8dkrf\" (UID: \"4f6a88e2-4257-4ed0-9381-6dcd9d5647bb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8dkrf" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.906136 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ba38415-c2e4-4550-a610-40bcee6d1323-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-drrzv\" (UID: \"1ba38415-c2e4-4550-a610-40bcee6d1323\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-drrzv" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.906175 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvcfd\" (UniqueName: \"kubernetes.io/projected/33d4526f-3a59-40c2-b9c9-d93ced5dcd17-kube-api-access-gvcfd\") pod \"machine-config-controller-84d6567774-rcv57\" (UID: \"33d4526f-3a59-40c2-b9c9-d93ced5dcd17\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rcv57" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.906211 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5e28238a-9bf2-4e10-827d-7350e0ec0150-oauth-serving-cert\") pod \"console-f9d7485db-xtfsg\" (UID: \"5e28238a-9bf2-4e10-827d-7350e0ec0150\") " pod="openshift-console/console-f9d7485db-xtfsg" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.906674 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97d879b1-ea51-444b-8c31-5dc89c3a1fb2-service-ca-bundle\") pod \"authentication-operator-69f744f599-sksft\" (UID: \"97d879b1-ea51-444b-8c31-5dc89c3a1fb2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sksft" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.907988 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5e28238a-9bf2-4e10-827d-7350e0ec0150-console-config\") pod \"console-f9d7485db-xtfsg\" (UID: \"5e28238a-9bf2-4e10-827d-7350e0ec0150\") " pod="openshift-console/console-f9d7485db-xtfsg" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.908266 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/19fd9940-52eb-4a65-8e75-531a27563c1b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.908630 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e28238a-9bf2-4e10-827d-7350e0ec0150-trusted-ca-bundle\") pod \"console-f9d7485db-xtfsg\" (UID: \"5e28238a-9bf2-4e10-827d-7350e0ec0150\") " pod="openshift-console/console-f9d7485db-xtfsg" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.909198 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5e28238a-9bf2-4e10-827d-7350e0ec0150-service-ca\") pod \"console-f9d7485db-xtfsg\" (UID: \"5e28238a-9bf2-4e10-827d-7350e0ec0150\") " pod="openshift-console/console-f9d7485db-xtfsg" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.911372 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.912181 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a862cad8-e4de-41c5-a83b-734574bc958e-config\") pod \"openshift-apiserver-operator-796bbdcf4f-hwt8j\" (UID: \"a862cad8-e4de-41c5-a83b-734574bc958e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwt8j" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.913081 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/19fd9940-52eb-4a65-8e75-531a27563c1b-trusted-ca\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.913137 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f1dfc67-48c4-4c0d-9807-6b962a542d71-config\") pod \"etcd-operator-b45778765-9r92q\" (UID: \"4f1dfc67-48c4-4c0d-9807-6b962a542d71\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9r92q" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.913991 4735 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97d879b1-ea51-444b-8c31-5dc89c3a1fb2-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-sksft\" (UID: \"97d879b1-ea51-444b-8c31-5dc89c3a1fb2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sksft" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.916211 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/4f1dfc67-48c4-4c0d-9807-6b962a542d71-etcd-service-ca\") pod \"etcd-operator-b45778765-9r92q\" (UID: \"4f1dfc67-48c4-4c0d-9807-6b962a542d71\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9r92q" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.918867 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/727a1c24-3564-4d03-910d-a1bda2a3667f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-dnlqj\" (UID: \"727a1c24-3564-4d03-910d-a1bda2a3667f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnlqj" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.919072 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2236757b-1fd9-4aea-9b6d-e5f56ae7ff42-auth-proxy-config\") pod \"machine-approver-56656f9798-279pq\" (UID: \"2236757b-1fd9-4aea-9b6d-e5f56ae7ff42\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-279pq" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.919765 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.920039 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/4f1dfc67-48c4-4c0d-9807-6b962a542d71-etcd-ca\") pod \"etcd-operator-b45778765-9r92q\" (UID: \"4f1dfc67-48c4-4c0d-9807-6b962a542d71\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9r92q" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.920391 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf1e6345-2675-413e-bd53-d456e57b08bd-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-wbs79\" (UID: \"bf1e6345-2675-413e-bd53-d456e57b08bd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wbs79" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.920428 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.920873 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2245b41c-0ccc-47ae-92c8-84ecfd80c53e-tmpfs\") pod \"packageserver-d55dfcdfc-fxcpc\" (UID: \"2245b41c-0ccc-47ae-92c8-84ecfd80c53e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fxcpc" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.921838 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6718bff7-4d68-4aa2-ad2b-1511e0799683-audit-policies\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: 
\"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.922822 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79bfc45c-3ae9-416b-9d4b-f56fab2387de-service-ca-bundle\") pod \"router-default-5444994796-qc826\" (UID: \"79bfc45c-3ae9-416b-9d4b-f56fab2387de\") " pod="openshift-ingress/router-default-5444994796-qc826" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.925110 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/19fd9940-52eb-4a65-8e75-531a27563c1b-registry-certificates\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.926347 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f6a88e2-4257-4ed0-9381-6dcd9d5647bb-trusted-ca\") pod \"ingress-operator-5b745b69d9-8dkrf\" (UID: \"4f6a88e2-4257-4ed0-9381-6dcd9d5647bb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8dkrf" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.931338 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b5fb857f-fab1-49d9-9c0b-a9a8bcf1dd11-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-thbzv\" (UID: \"b5fb857f-fab1-49d9-9c0b-a9a8bcf1dd11\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-thbzv" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.931914 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8e03e157-1b07-41e4-80a2-c751ca26e2a7-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7962v\" (UID: \"8e03e157-1b07-41e4-80a2-c751ca26e2a7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7962v" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.931909 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-kgr86"] Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.936351 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/70b90ab7-92ad-43c7-9796-15ec49caae3e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-hvc25\" (UID: \"70b90ab7-92ad-43c7-9796-15ec49caae3e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvc25" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.936615 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/19fd9940-52eb-4a65-8e75-531a27563c1b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.937245 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.939267 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2245b41c-0ccc-47ae-92c8-84ecfd80c53e-apiservice-cert\") 
pod \"packageserver-d55dfcdfc-fxcpc\" (UID: \"2245b41c-0ccc-47ae-92c8-84ecfd80c53e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fxcpc" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.940862 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.943236 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/79bfc45c-3ae9-416b-9d4b-f56fab2387de-stats-auth\") pod \"router-default-5444994796-qc826\" (UID: \"79bfc45c-3ae9-416b-9d4b-f56fab2387de\") " pod="openshift-ingress/router-default-5444994796-qc826" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.943358 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf1e6345-2675-413e-bd53-d456e57b08bd-serving-cert\") pod \"controller-manager-879f6c89f-wbs79\" (UID: \"bf1e6345-2675-413e-bd53-d456e57b08bd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wbs79" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.945071 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/70b90ab7-92ad-43c7-9796-15ec49caae3e-srv-cert\") pod \"olm-operator-6b444d44fb-hvc25\" (UID: \"70b90ab7-92ad-43c7-9796-15ec49caae3e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvc25" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.946198 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/33d4526f-3a59-40c2-b9c9-d93ced5dcd17-proxy-tls\") pod \"machine-config-controller-84d6567774-rcv57\" (UID: \"33d4526f-3a59-40c2-b9c9-d93ced5dcd17\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rcv57" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.947903 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8e03e157-1b07-41e4-80a2-c751ca26e2a7-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7962v\" (UID: \"8e03e157-1b07-41e4-80a2-c751ca26e2a7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7962v" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.948319 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5e28238a-9bf2-4e10-827d-7350e0ec0150-console-oauth-config\") pod \"console-f9d7485db-xtfsg\" (UID: \"5e28238a-9bf2-4e10-827d-7350e0ec0150\") " pod="openshift-console/console-f9d7485db-xtfsg" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.948819 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a862cad8-e4de-41c5-a83b-734574bc958e-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-hwt8j\" (UID: \"a862cad8-e4de-41c5-a83b-734574bc958e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwt8j" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.949339 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2245b41c-0ccc-47ae-92c8-84ecfd80c53e-webhook-cert\") pod \"packageserver-d55dfcdfc-fxcpc\" (UID: \"2245b41c-0ccc-47ae-92c8-84ecfd80c53e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fxcpc" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.949558 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.949615 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.949764 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4f6a88e2-4257-4ed0-9381-6dcd9d5647bb-bound-sa-token\") pod \"ingress-operator-5b745b69d9-8dkrf\" (UID: \"4f6a88e2-4257-4ed0-9381-6dcd9d5647bb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8dkrf" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.949918 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97d879b1-ea51-444b-8c31-5dc89c3a1fb2-serving-cert\") pod \"authentication-operator-69f744f599-sksft\" (UID: \"97d879b1-ea51-444b-8c31-5dc89c3a1fb2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sksft" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.950413 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/727a1c24-3564-4d03-910d-a1bda2a3667f-serving-cert\") pod \"openshift-config-operator-7777fb866f-dnlqj\" (UID: \"727a1c24-3564-4d03-910d-a1bda2a3667f\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnlqj" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.952764 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4f6a88e2-4257-4ed0-9381-6dcd9d5647bb-metrics-tls\") pod \"ingress-operator-5b745b69d9-8dkrf\" (UID: \"4f6a88e2-4257-4ed0-9381-6dcd9d5647bb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8dkrf" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.954175 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/79bfc45c-3ae9-416b-9d4b-f56fab2387de-metrics-certs\") pod \"router-default-5444994796-qc826\" (UID: \"79bfc45c-3ae9-416b-9d4b-f56fab2387de\") " pod="openshift-ingress/router-default-5444994796-qc826" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.954902 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6h2n\" (UniqueName: \"kubernetes.io/projected/8e03e157-1b07-41e4-80a2-c751ca26e2a7-kube-api-access-m6h2n\") pod \"cluster-image-registry-operator-dc59b4c8b-7962v\" (UID: \"8e03e157-1b07-41e4-80a2-c751ca26e2a7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7962v" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.956017 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.956165 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5e814370-2cf9-4f05-ba2a-cf771d55529f-metrics-tls\") pod 
\"dns-operator-744455d44c-d4p82\" (UID: \"5e814370-2cf9-4f05-ba2a-cf771d55529f\") " pod="openshift-dns-operator/dns-operator-744455d44c-d4p82" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.960116 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f1dfc67-48c4-4c0d-9807-6b962a542d71-serving-cert\") pod \"etcd-operator-b45778765-9r92q\" (UID: \"4f1dfc67-48c4-4c0d-9807-6b962a542d71\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9r92q" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.961195 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1d4a525c-7b1a-4bde-976a-d4b938c27209-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-76rns\" (UID: \"1d4a525c-7b1a-4bde-976a-d4b938c27209\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-76rns" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.964822 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ba38415-c2e4-4550-a610-40bcee6d1323-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-drrzv\" (UID: \"1ba38415-c2e4-4550-a610-40bcee6d1323\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-drrzv" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.966138 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-dm5x5"] Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.966473 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e28238a-9bf2-4e10-827d-7350e0ec0150-console-serving-cert\") pod \"console-f9d7485db-xtfsg\" (UID: \"5e28238a-9bf2-4e10-827d-7350e0ec0150\") " 
pod="openshift-console/console-f9d7485db-xtfsg" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.967107 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.967515 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4f1dfc67-48c4-4c0d-9807-6b962a542d71-etcd-client\") pod \"etcd-operator-b45778765-9r92q\" (UID: \"4f1dfc67-48c4-4c0d-9807-6b962a542d71\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9r92q" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.968724 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.972337 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2236757b-1fd9-4aea-9b6d-e5f56ae7ff42-machine-approver-tls\") pod \"machine-approver-56656f9798-279pq\" (UID: \"2236757b-1fd9-4aea-9b6d-e5f56ae7ff42\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-279pq" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.975993 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19fd9940-52eb-4a65-8e75-531a27563c1b-registry-tls\") pod 
\"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.976181 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.976756 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/79bfc45c-3ae9-416b-9d4b-f56fab2387de-default-certificate\") pod \"router-default-5444994796-qc826\" (UID: \"79bfc45c-3ae9-416b-9d4b-f56fab2387de\") " pod="openshift-ingress/router-default-5444994796-qc826" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.986387 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjcn5\" (UniqueName: \"kubernetes.io/projected/4f6a88e2-4257-4ed0-9381-6dcd9d5647bb-kube-api-access-vjcn5\") pod \"ingress-operator-5b745b69d9-8dkrf\" (UID: \"4f6a88e2-4257-4ed0-9381-6dcd9d5647bb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8dkrf" Oct 01 10:19:20 crc kubenswrapper[4735]: W1001 10:19:20.994784 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8642ffb_52ec_44db_b2f8_33a1e98b5328.slice/crio-6776b949e7b030f39ff815a75fc865d5de4915218e48d36a1a5cff2f150684f0 WatchSource:0}: Error finding container 6776b949e7b030f39ff815a75fc865d5de4915218e48d36a1a5cff2f150684f0: Status 404 returned error can't find the container with id 6776b949e7b030f39ff815a75fc865d5de4915218e48d36a1a5cff2f150684f0 Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 
10:19:20.995940 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgg2t\" (UniqueName: \"kubernetes.io/projected/79bfc45c-3ae9-416b-9d4b-f56fab2387de-kube-api-access-fgg2t\") pod \"router-default-5444994796-qc826\" (UID: \"79bfc45c-3ae9-416b-9d4b-f56fab2387de\") " pod="openshift-ingress/router-default-5444994796-qc826" Oct 01 10:19:20 crc kubenswrapper[4735]: I1001 10:19:20.996432 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gbmw6"] Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.007454 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.007720 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7cdf3b9f-6a98-4b7c-9953-a8623f968d89-images\") pod \"machine-config-operator-74547568cd-phjsf\" (UID: \"7cdf3b9f-6a98-4b7c-9953-a8623f968d89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-phjsf" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.007751 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxwnz\" (UniqueName: \"kubernetes.io/projected/9999d995-4881-4349-ba17-89eef6722d16-kube-api-access-dxwnz\") pod \"csi-hostpathplugin-l67sz\" (UID: \"9999d995-4881-4349-ba17-89eef6722d16\") " pod="hostpath-provisioner/csi-hostpathplugin-l67sz" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.007784 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/c3ecaab8-1253-4559-bc4f-76871e1edff8-srv-cert\") pod \"catalog-operator-68c6474976-t7rvp\" (UID: \"c3ecaab8-1253-4559-bc4f-76871e1edff8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7rvp" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.007818 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5cnf\" (UniqueName: \"kubernetes.io/projected/56f4b38b-92d5-434f-8b9e-dff6c5cb054a-kube-api-access-b5cnf\") pod \"multus-admission-controller-857f4d67dd-2jcl8\" (UID: \"56f4b38b-92d5-434f-8b9e-dff6c5cb054a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2jcl8" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.007872 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8678dbc-4d5c-4081-af9f-b26869f98034-config-volume\") pod \"dns-default-qmlh7\" (UID: \"f8678dbc-4d5c-4081-af9f-b26869f98034\") " pod="openshift-dns/dns-default-qmlh7" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.007941 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9knlc\" (UniqueName: \"kubernetes.io/projected/b79ac2f9-c8bc-4893-ace2-ca598d77ff52-kube-api-access-9knlc\") pod \"marketplace-operator-79b997595-cqpz2\" (UID: \"b79ac2f9-c8bc-4893-ace2-ca598d77ff52\") " pod="openshift-marketplace/marketplace-operator-79b997595-cqpz2" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.008006 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xql4l\" (UniqueName: \"kubernetes.io/projected/c3ecaab8-1253-4559-bc4f-76871e1edff8-kube-api-access-xql4l\") pod \"catalog-operator-68c6474976-t7rvp\" (UID: \"c3ecaab8-1253-4559-bc4f-76871e1edff8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7rvp" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.008054 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbfj9\" (UniqueName: \"kubernetes.io/projected/f8678dbc-4d5c-4081-af9f-b26869f98034-kube-api-access-lbfj9\") pod \"dns-default-qmlh7\" (UID: \"f8678dbc-4d5c-4081-af9f-b26869f98034\") " pod="openshift-dns/dns-default-qmlh7" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.008083 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28a73dbc-e661-4ceb-80cb-66a21c6895e1-config\") pod \"kube-apiserver-operator-766d6c64bb-nwvhh\" (UID: \"28a73dbc-e661-4ceb-80cb-66a21c6895e1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwvhh" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.008111 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81c0d8ce-d766-46bb-bd2b-79043b149331-cert\") pod \"ingress-canary-fkz86\" (UID: \"81c0d8ce-d766-46bb-bd2b-79043b149331\") " pod="openshift-ingress-canary/ingress-canary-fkz86" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.008152 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28a73dbc-e661-4ceb-80cb-66a21c6895e1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-nwvhh\" (UID: \"28a73dbc-e661-4ceb-80cb-66a21c6895e1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwvhh" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.008177 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4d1aa0d7-e9f4-48eb-86ef-ec9c9451a447-certs\") pod \"machine-config-server-tg55d\" (UID: \"4d1aa0d7-e9f4-48eb-86ef-ec9c9451a447\") " pod="openshift-machine-config-operator/machine-config-server-tg55d" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.008201 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa5136cb-51cf-4bc5-8839-d3cbf64e4c39-config-volume\") pod \"collect-profiles-29321895-x8dfs\" (UID: \"fa5136cb-51cf-4bc5-8839-d3cbf64e4c39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321895-x8dfs" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.008228 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4d1aa0d7-e9f4-48eb-86ef-ec9c9451a447-node-bootstrap-token\") pod \"machine-config-server-tg55d\" (UID: \"4d1aa0d7-e9f4-48eb-86ef-ec9c9451a447\") " pod="openshift-machine-config-operator/machine-config-server-tg55d" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.008256 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7cdf3b9f-6a98-4b7c-9953-a8623f968d89-auth-proxy-config\") pod \"machine-config-operator-74547568cd-phjsf\" (UID: \"7cdf3b9f-6a98-4b7c-9953-a8623f968d89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-phjsf" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.008281 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc0f68ff-8c8c-40f6-bd13-e0bec87fb9f2-serving-cert\") pod \"service-ca-operator-777779d784-cx2v2\" (UID: \"fc0f68ff-8c8c-40f6-bd13-e0bec87fb9f2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cx2v2" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.008322 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b79ac2f9-c8bc-4893-ace2-ca598d77ff52-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cqpz2\" (UID: \"b79ac2f9-c8bc-4893-ace2-ca598d77ff52\") 
" pod="openshift-marketplace/marketplace-operator-79b997595-cqpz2" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.008346 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7cdf3b9f-6a98-4b7c-9953-a8623f968d89-proxy-tls\") pod \"machine-config-operator-74547568cd-phjsf\" (UID: \"7cdf3b9f-6a98-4b7c-9953-a8623f968d89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-phjsf" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.008373 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f22tm\" (UniqueName: \"kubernetes.io/projected/fa5136cb-51cf-4bc5-8839-d3cbf64e4c39-kube-api-access-f22tm\") pod \"collect-profiles-29321895-x8dfs\" (UID: \"fa5136cb-51cf-4bc5-8839-d3cbf64e4c39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321895-x8dfs" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.008399 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9w8t\" (UniqueName: \"kubernetes.io/projected/7cdf3b9f-6a98-4b7c-9953-a8623f968d89-kube-api-access-j9w8t\") pod \"machine-config-operator-74547568cd-phjsf\" (UID: \"7cdf3b9f-6a98-4b7c-9953-a8623f968d89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-phjsf" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.008441 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/76458fc4-8f1e-462d-8a5c-1af31c52f7b6-signing-cabundle\") pod \"service-ca-9c57cc56f-xzlpn\" (UID: \"76458fc4-8f1e-462d-8a5c-1af31c52f7b6\") " pod="openshift-service-ca/service-ca-9c57cc56f-xzlpn" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.008466 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/9999d995-4881-4349-ba17-89eef6722d16-csi-data-dir\") pod \"csi-hostpathplugin-l67sz\" (UID: \"9999d995-4881-4349-ba17-89eef6722d16\") " pod="hostpath-provisioner/csi-hostpathplugin-l67sz" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.008573 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpw5s\" (UniqueName: \"kubernetes.io/projected/4d1aa0d7-e9f4-48eb-86ef-ec9c9451a447-kube-api-access-hpw5s\") pod \"machine-config-server-tg55d\" (UID: \"4d1aa0d7-e9f4-48eb-86ef-ec9c9451a447\") " pod="openshift-machine-config-operator/machine-config-server-tg55d" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.008601 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa5136cb-51cf-4bc5-8839-d3cbf64e4c39-secret-volume\") pod \"collect-profiles-29321895-x8dfs\" (UID: \"fa5136cb-51cf-4bc5-8839-d3cbf64e4c39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321895-x8dfs" Oct 01 10:19:21 crc kubenswrapper[4735]: E1001 10:19:21.008623 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:21.508602416 +0000 UTC m=+120.201423678 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.008663 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8678dbc-4d5c-4081-af9f-b26869f98034-metrics-tls\") pod \"dns-default-qmlh7\" (UID: \"f8678dbc-4d5c-4081-af9f-b26869f98034\") " pod="openshift-dns/dns-default-qmlh7" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.008705 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b79ac2f9-c8bc-4893-ace2-ca598d77ff52-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cqpz2\" (UID: \"b79ac2f9-c8bc-4893-ace2-ca598d77ff52\") " pod="openshift-marketplace/marketplace-operator-79b997595-cqpz2" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.008722 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28a73dbc-e661-4ceb-80cb-66a21c6895e1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-nwvhh\" (UID: \"28a73dbc-e661-4ceb-80cb-66a21c6895e1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwvhh" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.008740 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq9s7\" (UniqueName: \"kubernetes.io/projected/fc0f68ff-8c8c-40f6-bd13-e0bec87fb9f2-kube-api-access-mq9s7\") pod \"service-ca-operator-777779d784-cx2v2\" 
(UID: \"fc0f68ff-8c8c-40f6-bd13-e0bec87fb9f2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cx2v2" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.008763 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9999d995-4881-4349-ba17-89eef6722d16-socket-dir\") pod \"csi-hostpathplugin-l67sz\" (UID: \"9999d995-4881-4349-ba17-89eef6722d16\") " pod="hostpath-provisioner/csi-hostpathplugin-l67sz" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.008783 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/76458fc4-8f1e-462d-8a5c-1af31c52f7b6-signing-key\") pod \"service-ca-9c57cc56f-xzlpn\" (UID: \"76458fc4-8f1e-462d-8a5c-1af31c52f7b6\") " pod="openshift-service-ca/service-ca-9c57cc56f-xzlpn" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.008799 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9999d995-4881-4349-ba17-89eef6722d16-plugins-dir\") pod \"csi-hostpathplugin-l67sz\" (UID: \"9999d995-4881-4349-ba17-89eef6722d16\") " pod="hostpath-provisioner/csi-hostpathplugin-l67sz" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.008824 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncdsf\" (UniqueName: \"kubernetes.io/projected/81c0d8ce-d766-46bb-bd2b-79043b149331-kube-api-access-ncdsf\") pod \"ingress-canary-fkz86\" (UID: \"81c0d8ce-d766-46bb-bd2b-79043b149331\") " pod="openshift-ingress-canary/ingress-canary-fkz86" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.008839 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz7hh\" (UniqueName: \"kubernetes.io/projected/afb4417d-4b1d-4d1e-8724-a78b3781b4f1-kube-api-access-bz7hh\") pod \"migrator-59844c95c7-sngf8\" (UID: 
\"afb4417d-4b1d-4d1e-8724-a78b3781b4f1\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sngf8" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.008855 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fswbb\" (UniqueName: \"kubernetes.io/projected/76458fc4-8f1e-462d-8a5c-1af31c52f7b6-kube-api-access-fswbb\") pod \"service-ca-9c57cc56f-xzlpn\" (UID: \"76458fc4-8f1e-462d-8a5c-1af31c52f7b6\") " pod="openshift-service-ca/service-ca-9c57cc56f-xzlpn" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.008871 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9631d017-2723-4e2f-b285-c38333929edf-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-fkcd4\" (UID: \"9631d017-2723-4e2f-b285-c38333929edf\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fkcd4" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.008885 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9999d995-4881-4349-ba17-89eef6722d16-mountpoint-dir\") pod \"csi-hostpathplugin-l67sz\" (UID: \"9999d995-4881-4349-ba17-89eef6722d16\") " pod="hostpath-provisioner/csi-hostpathplugin-l67sz" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.008909 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/56f4b38b-92d5-434f-8b9e-dff6c5cb054a-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2jcl8\" (UID: \"56f4b38b-92d5-434f-8b9e-dff6c5cb054a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2jcl8" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.008925 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fc0f68ff-8c8c-40f6-bd13-e0bec87fb9f2-config\") pod \"service-ca-operator-777779d784-cx2v2\" (UID: \"fc0f68ff-8c8c-40f6-bd13-e0bec87fb9f2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cx2v2" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.008963 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c3ecaab8-1253-4559-bc4f-76871e1edff8-profile-collector-cert\") pod \"catalog-operator-68c6474976-t7rvp\" (UID: \"c3ecaab8-1253-4559-bc4f-76871e1edff8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7rvp" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.008986 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9999d995-4881-4349-ba17-89eef6722d16-registration-dir\") pod \"csi-hostpathplugin-l67sz\" (UID: \"9999d995-4881-4349-ba17-89eef6722d16\") " pod="hostpath-provisioner/csi-hostpathplugin-l67sz" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.009002 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzlj4\" (UniqueName: \"kubernetes.io/projected/9631d017-2723-4e2f-b285-c38333929edf-kube-api-access-fzlj4\") pod \"package-server-manager-789f6589d5-fkcd4\" (UID: \"9631d017-2723-4e2f-b285-c38333929edf\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fkcd4" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.009707 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7cdf3b9f-6a98-4b7c-9953-a8623f968d89-images\") pod \"machine-config-operator-74547568cd-phjsf\" (UID: \"7cdf3b9f-6a98-4b7c-9953-a8623f968d89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-phjsf" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 
10:19:21.010053 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28a73dbc-e661-4ceb-80cb-66a21c6895e1-config\") pod \"kube-apiserver-operator-766d6c64bb-nwvhh\" (UID: \"28a73dbc-e661-4ceb-80cb-66a21c6895e1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwvhh" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.010685 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9999d995-4881-4349-ba17-89eef6722d16-plugins-dir\") pod \"csi-hostpathplugin-l67sz\" (UID: \"9999d995-4881-4349-ba17-89eef6722d16\") " pod="hostpath-provisioner/csi-hostpathplugin-l67sz" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.010784 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9999d995-4881-4349-ba17-89eef6722d16-registration-dir\") pod \"csi-hostpathplugin-l67sz\" (UID: \"9999d995-4881-4349-ba17-89eef6722d16\") " pod="hostpath-provisioner/csi-hostpathplugin-l67sz" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.011633 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4d1aa0d7-e9f4-48eb-86ef-ec9c9451a447-node-bootstrap-token\") pod \"machine-config-server-tg55d\" (UID: \"4d1aa0d7-e9f4-48eb-86ef-ec9c9451a447\") " pod="openshift-machine-config-operator/machine-config-server-tg55d" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.011632 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/76458fc4-8f1e-462d-8a5c-1af31c52f7b6-signing-cabundle\") pod \"service-ca-9c57cc56f-xzlpn\" (UID: \"76458fc4-8f1e-462d-8a5c-1af31c52f7b6\") " pod="openshift-service-ca/service-ca-9c57cc56f-xzlpn" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.011724 
4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9999d995-4881-4349-ba17-89eef6722d16-csi-data-dir\") pod \"csi-hostpathplugin-l67sz\" (UID: \"9999d995-4881-4349-ba17-89eef6722d16\") " pod="hostpath-provisioner/csi-hostpathplugin-l67sz" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.011789 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9999d995-4881-4349-ba17-89eef6722d16-mountpoint-dir\") pod \"csi-hostpathplugin-l67sz\" (UID: \"9999d995-4881-4349-ba17-89eef6722d16\") " pod="hostpath-provisioner/csi-hostpathplugin-l67sz" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.012660 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa5136cb-51cf-4bc5-8839-d3cbf64e4c39-config-volume\") pod \"collect-profiles-29321895-x8dfs\" (UID: \"fa5136cb-51cf-4bc5-8839-d3cbf64e4c39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321895-x8dfs" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.012795 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7cdf3b9f-6a98-4b7c-9953-a8623f968d89-auth-proxy-config\") pod \"machine-config-operator-74547568cd-phjsf\" (UID: \"7cdf3b9f-6a98-4b7c-9953-a8623f968d89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-phjsf" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.013405 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9999d995-4881-4349-ba17-89eef6722d16-socket-dir\") pod \"csi-hostpathplugin-l67sz\" (UID: \"9999d995-4881-4349-ba17-89eef6722d16\") " pod="hostpath-provisioner/csi-hostpathplugin-l67sz" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.014705 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9631d017-2723-4e2f-b285-c38333929edf-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-fkcd4\" (UID: \"9631d017-2723-4e2f-b285-c38333929edf\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fkcd4" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.014873 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c3ecaab8-1253-4559-bc4f-76871e1edff8-srv-cert\") pod \"catalog-operator-68c6474976-t7rvp\" (UID: \"c3ecaab8-1253-4559-bc4f-76871e1edff8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7rvp" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.016031 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa5136cb-51cf-4bc5-8839-d3cbf64e4c39-secret-volume\") pod \"collect-profiles-29321895-x8dfs\" (UID: \"fa5136cb-51cf-4bc5-8839-d3cbf64e4c39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321895-x8dfs" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.016683 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b79ac2f9-c8bc-4893-ace2-ca598d77ff52-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cqpz2\" (UID: \"b79ac2f9-c8bc-4893-ace2-ca598d77ff52\") " pod="openshift-marketplace/marketplace-operator-79b997595-cqpz2" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.016724 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/76458fc4-8f1e-462d-8a5c-1af31c52f7b6-signing-key\") pod \"service-ca-9c57cc56f-xzlpn\" (UID: \"76458fc4-8f1e-462d-8a5c-1af31c52f7b6\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-xzlpn" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.017412 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c3ecaab8-1253-4559-bc4f-76871e1edff8-profile-collector-cert\") pod \"catalog-operator-68c6474976-t7rvp\" (UID: \"c3ecaab8-1253-4559-bc4f-76871e1edff8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7rvp" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.017982 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp99f\" (UniqueName: \"kubernetes.io/projected/a862cad8-e4de-41c5-a83b-734574bc958e-kube-api-access-pp99f\") pod \"openshift-apiserver-operator-796bbdcf4f-hwt8j\" (UID: \"a862cad8-e4de-41c5-a83b-734574bc958e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwt8j" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.018298 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4d1aa0d7-e9f4-48eb-86ef-ec9c9451a447-certs\") pod \"machine-config-server-tg55d\" (UID: \"4d1aa0d7-e9f4-48eb-86ef-ec9c9451a447\") " pod="openshift-machine-config-operator/machine-config-server-tg55d" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.018439 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b79ac2f9-c8bc-4893-ace2-ca598d77ff52-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cqpz2\" (UID: \"b79ac2f9-c8bc-4893-ace2-ca598d77ff52\") " pod="openshift-marketplace/marketplace-operator-79b997595-cqpz2" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.019541 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8678dbc-4d5c-4081-af9f-b26869f98034-config-volume\") pod 
\"dns-default-qmlh7\" (UID: \"f8678dbc-4d5c-4081-af9f-b26869f98034\") " pod="openshift-dns/dns-default-qmlh7" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.019638 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc0f68ff-8c8c-40f6-bd13-e0bec87fb9f2-serving-cert\") pod \"service-ca-operator-777779d784-cx2v2\" (UID: \"fc0f68ff-8c8c-40f6-bd13-e0bec87fb9f2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cx2v2" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.020028 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/56f4b38b-92d5-434f-8b9e-dff6c5cb054a-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2jcl8\" (UID: \"56f4b38b-92d5-434f-8b9e-dff6c5cb054a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2jcl8" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.020065 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28a73dbc-e661-4ceb-80cb-66a21c6895e1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-nwvhh\" (UID: \"28a73dbc-e661-4ceb-80cb-66a21c6895e1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwvhh" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.020337 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc0f68ff-8c8c-40f6-bd13-e0bec87fb9f2-config\") pod \"service-ca-operator-777779d784-cx2v2\" (UID: \"fc0f68ff-8c8c-40f6-bd13-e0bec87fb9f2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cx2v2" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.021464 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8678dbc-4d5c-4081-af9f-b26869f98034-metrics-tls\") 
pod \"dns-default-qmlh7\" (UID: \"f8678dbc-4d5c-4081-af9f-b26869f98034\") " pod="openshift-dns/dns-default-qmlh7" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.023989 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7cdf3b9f-6a98-4b7c-9953-a8623f968d89-proxy-tls\") pod \"machine-config-operator-74547568cd-phjsf\" (UID: \"7cdf3b9f-6a98-4b7c-9953-a8623f968d89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-phjsf" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.028857 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81c0d8ce-d766-46bb-bd2b-79043b149331-cert\") pod \"ingress-canary-fkz86\" (UID: \"81c0d8ce-d766-46bb-bd2b-79043b149331\") " pod="openshift-ingress-canary/ingress-canary-fkz86" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.052589 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2wnd\" (UniqueName: \"kubernetes.io/projected/bf1e6345-2675-413e-bd53-d456e57b08bd-kube-api-access-p2wnd\") pod \"controller-manager-879f6c89f-wbs79\" (UID: \"bf1e6345-2675-413e-bd53-d456e57b08bd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wbs79" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.053076 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-qc826" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.054796 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kkmzc"] Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.062449 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8dkrf" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.066984 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5z7p\" (UniqueName: \"kubernetes.io/projected/1d4a525c-7b1a-4bde-976a-d4b938c27209-kube-api-access-p5z7p\") pod \"control-plane-machine-set-operator-78cbb6b69f-76rns\" (UID: \"1d4a525c-7b1a-4bde-976a-d4b938c27209\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-76rns" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.078284 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/19fd9940-52eb-4a65-8e75-531a27563c1b-bound-sa-token\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:21 crc kubenswrapper[4735]: W1001 10:19:21.105661 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79bfc45c_3ae9_416b_9d4b_f56fab2387de.slice/crio-2e437119120e225a1af2d79b6ce317db4be6686e161c474b6490f370d030a560 WatchSource:0}: Error finding container 2e437119120e225a1af2d79b6ce317db4be6686e161c474b6490f370d030a560: Status 404 returned error can't find the container with id 2e437119120e225a1af2d79b6ce317db4be6686e161c474b6490f370d030a560 Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.107472 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvcfd\" (UniqueName: \"kubernetes.io/projected/33d4526f-3a59-40c2-b9c9-d93ced5dcd17-kube-api-access-gvcfd\") pod \"machine-config-controller-84d6567774-rcv57\" (UID: \"33d4526f-3a59-40c2-b9c9-d93ced5dcd17\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rcv57" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.109894 
4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:21 crc kubenswrapper[4735]: E1001 10:19:21.110232 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:21.610214727 +0000 UTC m=+120.303035989 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.110734 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kqslj"] Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.118610 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wbs79" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.121312 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6d59\" (UniqueName: \"kubernetes.io/projected/70b90ab7-92ad-43c7-9796-15ec49caae3e-kube-api-access-k6d59\") pod \"olm-operator-6b444d44fb-hvc25\" (UID: \"70b90ab7-92ad-43c7-9796-15ec49caae3e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvc25" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.138035 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rcv57" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.139732 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8e03e157-1b07-41e4-80a2-c751ca26e2a7-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7962v\" (UID: \"8e03e157-1b07-41e4-80a2-c751ca26e2a7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7962v" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.159418 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdx7d\" (UniqueName: \"kubernetes.io/projected/2245b41c-0ccc-47ae-92c8-84ecfd80c53e-kube-api-access-jdx7d\") pod \"packageserver-d55dfcdfc-fxcpc\" (UID: \"2245b41c-0ccc-47ae-92c8-84ecfd80c53e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fxcpc" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.182028 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tnzr\" (UniqueName: \"kubernetes.io/projected/5e28238a-9bf2-4e10-827d-7350e0ec0150-kube-api-access-6tnzr\") pod \"console-f9d7485db-xtfsg\" (UID: \"5e28238a-9bf2-4e10-827d-7350e0ec0150\") " pod="openshift-console/console-f9d7485db-xtfsg" Oct 01 
10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.205727 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cltqg\" (UniqueName: \"kubernetes.io/projected/2236757b-1fd9-4aea-9b6d-e5f56ae7ff42-kube-api-access-cltqg\") pod \"machine-approver-56656f9798-279pq\" (UID: \"2236757b-1fd9-4aea-9b6d-e5f56ae7ff42\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-279pq" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.211903 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.217068 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klqcz\" (UniqueName: \"kubernetes.io/projected/5e814370-2cf9-4f05-ba2a-cf771d55529f-kube-api-access-klqcz\") pod \"dns-operator-744455d44c-d4p82\" (UID: \"5e814370-2cf9-4f05-ba2a-cf771d55529f\") " pod="openshift-dns-operator/dns-operator-744455d44c-d4p82" Oct 01 10:19:21 crc kubenswrapper[4735]: E1001 10:19:21.217529 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:21.717490991 +0000 UTC m=+120.410312253 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.218718 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:21 crc kubenswrapper[4735]: E1001 10:19:21.219215 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:21.719194006 +0000 UTC m=+120.412015268 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:21 crc kubenswrapper[4735]: W1001 10:19:21.220305 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e4cc884_e7b2_4c28_9b45_475517f210a8.slice/crio-962e5c6187e44193dd88fda1de954492cd483c27513c39f010ccc0a119c2b6c2 WatchSource:0}: Error finding container 962e5c6187e44193dd88fda1de954492cd483c27513c39f010ccc0a119c2b6c2: Status 404 returned error can't find the container with id 962e5c6187e44193dd88fda1de954492cd483c27513c39f010ccc0a119c2b6c2 Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.232313 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-279pq" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.237517 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwt8j" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.240313 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpjrb\" (UniqueName: \"kubernetes.io/projected/6718bff7-4d68-4aa2-ad2b-1511e0799683-kube-api-access-bpjrb\") pod \"oauth-openshift-558db77b4-rs7ln\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.259180 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8k2p\" (UniqueName: \"kubernetes.io/projected/97d879b1-ea51-444b-8c31-5dc89c3a1fb2-kube-api-access-c8k2p\") pod \"authentication-operator-69f744f599-sksft\" (UID: \"97d879b1-ea51-444b-8c31-5dc89c3a1fb2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sksft" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.293591 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-8dkrf"] Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.303637 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjvdp\" (UniqueName: \"kubernetes.io/projected/1ba38415-c2e4-4550-a610-40bcee6d1323-kube-api-access-fjvdp\") pod \"kube-storage-version-migrator-operator-b67b599dd-drrzv\" (UID: \"1ba38415-c2e4-4550-a610-40bcee6d1323\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-drrzv" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.303940 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:21 crc kubenswrapper[4735]: W1001 10:19:21.306865 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2236757b_1fd9_4aea_9b6d_e5f56ae7ff42.slice/crio-440e77dff2ad886f8da794756f9c7ae126af19a310b656737dec85e6bdd8ad9e WatchSource:0}: Error finding container 440e77dff2ad886f8da794756f9c7ae126af19a310b656737dec85e6bdd8ad9e: Status 404 returned error can't find the container with id 440e77dff2ad886f8da794756f9c7ae126af19a310b656737dec85e6bdd8ad9e Oct 01 10:19:21 crc kubenswrapper[4735]: W1001 10:19:21.308706 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f6a88e2_4257_4ed0_9381_6dcd9d5647bb.slice/crio-ba1ec086346b572ddfabc4aa6e273c5d93e9456d348236d55c67982fd8df34a5 WatchSource:0}: Error finding container ba1ec086346b572ddfabc4aa6e273c5d93e9456d348236d55c67982fd8df34a5: Status 404 returned error can't find the container with id ba1ec086346b572ddfabc4aa6e273c5d93e9456d348236d55c67982fd8df34a5 Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.319097 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:21 crc kubenswrapper[4735]: E1001 10:19:21.323365 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:21.819165235 +0000 UTC m=+120.511986487 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.323819 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:21 crc kubenswrapper[4735]: E1001 10:19:21.324108 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:21.824100156 +0000 UTC m=+120.516921418 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.325049 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7962v" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.326742 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dkj7\" (UniqueName: \"kubernetes.io/projected/b5fb857f-fab1-49d9-9c0b-a9a8bcf1dd11-kube-api-access-7dkj7\") pod \"cluster-samples-operator-665b6dd947-thbzv\" (UID: \"b5fb857f-fab1-49d9-9c0b-a9a8bcf1dd11\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-thbzv" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.331613 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-76rns" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.338602 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpm8r\" (UniqueName: \"kubernetes.io/projected/727a1c24-3564-4d03-910d-a1bda2a3667f-kube-api-access-lpm8r\") pod \"openshift-config-operator-7777fb866f-dnlqj\" (UID: \"727a1c24-3564-4d03-910d-a1bda2a3667f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnlqj" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.360455 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd8vw\" (UniqueName: \"kubernetes.io/projected/19fd9940-52eb-4a65-8e75-531a27563c1b-kube-api-access-wd8vw\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.375079 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28l6r\" (UniqueName: \"kubernetes.io/projected/4f1dfc67-48c4-4c0d-9807-6b962a542d71-kube-api-access-28l6r\") pod \"etcd-operator-b45778765-9r92q\" (UID: \"4f1dfc67-48c4-4c0d-9807-6b962a542d71\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-9r92q" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.390872 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-xtfsg" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.391195 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rcv57"] Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.394371 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbfj9\" (UniqueName: \"kubernetes.io/projected/f8678dbc-4d5c-4081-af9f-b26869f98034-kube-api-access-lbfj9\") pod \"dns-default-qmlh7\" (UID: \"f8678dbc-4d5c-4081-af9f-b26869f98034\") " pod="openshift-dns/dns-default-qmlh7" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.395810 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvc25" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.403124 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-drrzv" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.416684 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzlj4\" (UniqueName: \"kubernetes.io/projected/9631d017-2723-4e2f-b285-c38333929edf-kube-api-access-fzlj4\") pod \"package-server-manager-789f6589d5-fkcd4\" (UID: \"9631d017-2723-4e2f-b285-c38333929edf\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fkcd4" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.418470 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-d4p82" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.423216 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wbs79"] Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.425556 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:21 crc kubenswrapper[4735]: E1001 10:19:21.425685 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:21.925663497 +0000 UTC m=+120.618484759 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.425901 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:21 crc kubenswrapper[4735]: E1001 10:19:21.426275 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:21.926264333 +0000 UTC m=+120.619085605 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.432274 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fxcpc" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.441058 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28a73dbc-e661-4ceb-80cb-66a21c6895e1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-nwvhh\" (UID: \"28a73dbc-e661-4ceb-80cb-66a21c6895e1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwvhh" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.445063 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fkcd4" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.445670 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnlqj" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.450213 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwvhh" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.467564 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxwnz\" (UniqueName: \"kubernetes.io/projected/9999d995-4881-4349-ba17-89eef6722d16-kube-api-access-dxwnz\") pod \"csi-hostpathplugin-l67sz\" (UID: \"9999d995-4881-4349-ba17-89eef6722d16\") " pod="hostpath-provisioner/csi-hostpathplugin-l67sz" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.471910 4735 generic.go:334] "Generic (PLEG): container finished" podID="8b62c04d-523d-4858-bddf-5d2162392962" containerID="2bee6ff4d00e6184aafd6ddc1fd1cdabe58bf521bf16b905e5b0dcaab3659d97" exitCode=0 Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.472034 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" event={"ID":"8b62c04d-523d-4858-bddf-5d2162392962","Type":"ContainerDied","Data":"2bee6ff4d00e6184aafd6ddc1fd1cdabe58bf521bf16b905e5b0dcaab3659d97"} Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.472096 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" event={"ID":"8b62c04d-523d-4858-bddf-5d2162392962","Type":"ContainerStarted","Data":"27d455b21706c55942d5c1b3b46c997e79e5d77e6bd9a1922a598f8af3b73f77"} Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.476376 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rcv57" event={"ID":"33d4526f-3a59-40c2-b9c9-d93ced5dcd17","Type":"ContainerStarted","Data":"91cc6bd07bbe736aba235b9d99e3a28fe30e907e18cea26935b8a0b8732f7328"} Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.482657 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xql4l\" (UniqueName: 
\"kubernetes.io/projected/c3ecaab8-1253-4559-bc4f-76871e1edff8-kube-api-access-xql4l\") pod \"catalog-operator-68c6474976-t7rvp\" (UID: \"c3ecaab8-1253-4559-bc4f-76871e1edff8\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7rvp" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.488568 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8dkrf" event={"ID":"4f6a88e2-4257-4ed0-9381-6dcd9d5647bb","Type":"ContainerStarted","Data":"ba1ec086346b572ddfabc4aa6e273c5d93e9456d348236d55c67982fd8df34a5"} Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.492277 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7rvp" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.497474 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-279pq" event={"ID":"2236757b-1fd9-4aea-9b6d-e5f56ae7ff42","Type":"ContainerStarted","Data":"440e77dff2ad886f8da794756f9c7ae126af19a310b656737dec85e6bdd8ad9e"} Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.499568 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-qc826" event={"ID":"79bfc45c-3ae9-416b-9d4b-f56fab2387de","Type":"ContainerStarted","Data":"ef0005f73dcc07165ac45eec5a4a33f135911ee11182a4403acb7aea3e81199f"} Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.499608 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-qc826" event={"ID":"79bfc45c-3ae9-416b-9d4b-f56fab2387de","Type":"ContainerStarted","Data":"2e437119120e225a1af2d79b6ce317db4be6686e161c474b6490f370d030a560"} Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.500577 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f22tm\" (UniqueName: 
\"kubernetes.io/projected/fa5136cb-51cf-4bc5-8839-d3cbf64e4c39-kube-api-access-f22tm\") pod \"collect-profiles-29321895-x8dfs\" (UID: \"fa5136cb-51cf-4bc5-8839-d3cbf64e4c39\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321895-x8dfs" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.509256 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-625dj" event={"ID":"d604b23e-ed8c-486f-b7d5-5a5ddd308947","Type":"ContainerStarted","Data":"ed6898fb34b96da340f78c86273757ef5b076af059a3c68cf78af2621a239e21"} Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.509548 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-625dj" event={"ID":"d604b23e-ed8c-486f-b7d5-5a5ddd308947","Type":"ContainerStarted","Data":"255e42b188080997a5dea4e08ffe0b07a757383c89c3f1d2bb04affcee905698"} Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.510558 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-625dj" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.511554 4735 generic.go:334] "Generic (PLEG): container finished" podID="dc4804a2-9d77-4820-bebd-93094a7143ec" containerID="0f921174eb734c3bd1a7be25f3db0eb00bfc8d2569fccf115f58bfceee149098" exitCode=0 Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.511944 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv" event={"ID":"dc4804a2-9d77-4820-bebd-93094a7143ec","Type":"ContainerDied","Data":"0f921174eb734c3bd1a7be25f3db0eb00bfc8d2569fccf115f58bfceee149098"} Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.511958 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv" 
event={"ID":"dc4804a2-9d77-4820-bebd-93094a7143ec","Type":"ContainerStarted","Data":"276f77d18147fc55c1116268ae6f8cc47b8fe0702c885a46becfa4d8a9c09407"} Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.519360 4735 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-625dj container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.519413 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-625dj" podUID="d604b23e-ed8c-486f-b7d5-5a5ddd308947" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.520094 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9knlc\" (UniqueName: \"kubernetes.io/projected/b79ac2f9-c8bc-4893-ace2-ca598d77ff52-kube-api-access-9knlc\") pod \"marketplace-operator-79b997595-cqpz2\" (UID: \"b79ac2f9-c8bc-4893-ace2-ca598d77ff52\") " pod="openshift-marketplace/marketplace-operator-79b997595-cqpz2" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.521532 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-sksft" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.522881 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-kgr86" event={"ID":"6963210d-abf6-43ad-80ea-72831b6d7504","Type":"ContainerStarted","Data":"318ac79466e9fe46b816c0b9e0fcf197f5a5f993da2b0068815005e39a420345"} Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.522913 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-kgr86" event={"ID":"6963210d-abf6-43ad-80ea-72831b6d7504","Type":"ContainerStarted","Data":"970c4931b5ecbac8d41e093691a4ddf30962fd1ae7bcc17dd79c4153998a423f"} Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.522922 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-kgr86" event={"ID":"6963210d-abf6-43ad-80ea-72831b6d7504","Type":"ContainerStarted","Data":"0b402daa0fcbbf3554b0bc40cd6ef046c6396156d4786b0cb9a4cebe69230c79"} Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.526490 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.526727 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dm5x5" event={"ID":"c8642ffb-52ec-44db-b2f8-33a1e98b5328","Type":"ContainerStarted","Data":"0a21f65d37284cc56de7b6b56302473d1a481b097eb30b5aae5ec549c05d4552"} Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.527150 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dm5x5" 
event={"ID":"c8642ffb-52ec-44db-b2f8-33a1e98b5328","Type":"ContainerStarted","Data":"6776b949e7b030f39ff815a75fc865d5de4915218e48d36a1a5cff2f150684f0"} Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.527841 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-dm5x5" Oct 01 10:19:21 crc kubenswrapper[4735]: E1001 10:19:21.528419 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:22.028403739 +0000 UTC m=+120.721225001 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.531840 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-l67sz" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.536662 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kkmzc" event={"ID":"5afd3b9d-a9e2-4a51-9e1f-fcebffc8215b","Type":"ContainerStarted","Data":"46f0fca6ec1033049c7fc2752e6b06b3f66872e3346d0117bf2cd9246e573cf8"} Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.537335 4735 patch_prober.go:28] interesting pod/downloads-7954f5f757-dm5x5 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.537372 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dm5x5" podUID="c8642ffb-52ec-44db-b2f8-33a1e98b5328" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.542185 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9w8t\" (UniqueName: \"kubernetes.io/projected/7cdf3b9f-6a98-4b7c-9953-a8623f968d89-kube-api-access-j9w8t\") pod \"machine-config-operator-74547568cd-phjsf\" (UID: \"7cdf3b9f-6a98-4b7c-9953-a8623f968d89\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-phjsf" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.544607 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kqslj" event={"ID":"5e4cc884-e7b2-4c28-9b45-475517f210a8","Type":"ContainerStarted","Data":"c7e1f9c6fc93dbe538f69f893b30d4c14e1fabf2a11e8af92bd1817069ba7f04"} Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 
10:19:21.544660 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kqslj" event={"ID":"5e4cc884-e7b2-4c28-9b45-475517f210a8","Type":"ContainerStarted","Data":"962e5c6187e44193dd88fda1de954492cd483c27513c39f010ccc0a119c2b6c2"} Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.548539 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-thbzv" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.552937 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qmlh7" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.553576 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gbmw6" event={"ID":"c3f0508a-69f7-42a8-b39e-58c8d6c12c51","Type":"ContainerStarted","Data":"a19c8a3eb48e3db272d2c591108e9de5613d60f49d7d2e4fc033f97109bc0998"} Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.553615 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gbmw6" event={"ID":"c3f0508a-69f7-42a8-b39e-58c8d6c12c51","Type":"ContainerStarted","Data":"2171d1cbd9572193a4eb15427d16bd02f30b1418e2492d0cc7141772e0376aa9"} Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.556440 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpw5s\" (UniqueName: \"kubernetes.io/projected/4d1aa0d7-e9f4-48eb-86ef-ec9c9451a447-kube-api-access-hpw5s\") pod \"machine-config-server-tg55d\" (UID: \"4d1aa0d7-e9f4-48eb-86ef-ec9c9451a447\") " pod="openshift-machine-config-operator/machine-config-server-tg55d" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.561104 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console-operator/console-operator-58897d9998-sjt7w" event={"ID":"64029635-e98e-458d-ace4-275d8f40d34e","Type":"ContainerStarted","Data":"dbd0cfe9a435f0e0408892436f8b4f60bb2de7289d149b3f6b4df37b360e2482"} Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.561188 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-sjt7w" event={"ID":"64029635-e98e-458d-ace4-275d8f40d34e","Type":"ContainerStarted","Data":"025247d811ab100a1e325b04e3e73dd06821fd126ac1bc564ba6029fbed46c39"} Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.564957 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-sjt7w" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.565047 4735 patch_prober.go:28] interesting pod/console-operator-58897d9998-sjt7w container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.565076 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-sjt7w" podUID="64029635-e98e-458d-ace4-275d8f40d34e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.580965 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5cnf\" (UniqueName: \"kubernetes.io/projected/56f4b38b-92d5-434f-8b9e-dff6c5cb054a-kube-api-access-b5cnf\") pod \"multus-admission-controller-857f4d67dd-2jcl8\" (UID: \"56f4b38b-92d5-434f-8b9e-dff6c5cb054a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2jcl8" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.597946 
4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fswbb\" (UniqueName: \"kubernetes.io/projected/76458fc4-8f1e-462d-8a5c-1af31c52f7b6-kube-api-access-fswbb\") pod \"service-ca-9c57cc56f-xzlpn\" (UID: \"76458fc4-8f1e-462d-8a5c-1af31c52f7b6\") " pod="openshift-service-ca/service-ca-9c57cc56f-xzlpn" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.624740 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncdsf\" (UniqueName: \"kubernetes.io/projected/81c0d8ce-d766-46bb-bd2b-79043b149331-kube-api-access-ncdsf\") pod \"ingress-canary-fkz86\" (UID: \"81c0d8ce-d766-46bb-bd2b-79043b149331\") " pod="openshift-ingress-canary/ingress-canary-fkz86" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.633747 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:21 crc kubenswrapper[4735]: E1001 10:19:21.635178 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:22.135163729 +0000 UTC m=+120.827985061 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.638013 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-9r92q" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.647256 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq9s7\" (UniqueName: \"kubernetes.io/projected/fc0f68ff-8c8c-40f6-bd13-e0bec87fb9f2-kube-api-access-mq9s7\") pod \"service-ca-operator-777779d784-cx2v2\" (UID: \"fc0f68ff-8c8c-40f6-bd13-e0bec87fb9f2\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cx2v2" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.660341 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz7hh\" (UniqueName: \"kubernetes.io/projected/afb4417d-4b1d-4d1e-8724-a78b3781b4f1-kube-api-access-bz7hh\") pod \"migrator-59844c95c7-sngf8\" (UID: \"afb4417d-4b1d-4d1e-8724-a78b3781b4f1\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sngf8" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.706741 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-76rns"] Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.734773 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7962v"] Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.752584 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:21 crc kubenswrapper[4735]: E1001 10:19:21.753185 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:22.25317021 +0000 UTC m=+120.945991472 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.760834 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-2jcl8" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.765822 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cx2v2" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.770796 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cqpz2" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.783919 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-xzlpn" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.784451 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321895-x8dfs" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.804324 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sngf8" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.811913 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-phjsf" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.824526 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwt8j"] Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.836575 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-rs7ln"] Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.847348 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.854969 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-tg55d" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.862046 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:21 crc kubenswrapper[4735]: E1001 10:19:21.862621 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:22.362608912 +0000 UTC m=+121.055430174 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.881933 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.885602 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fkz86" Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.889838 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvc25"] Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.963216 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:21 crc kubenswrapper[4735]: E1001 10:19:21.963525 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:22.463443073 +0000 UTC m=+121.156264335 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:21 crc kubenswrapper[4735]: I1001 10:19:21.963872 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:21 crc kubenswrapper[4735]: E1001 10:19:21.964215 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:22.464203002 +0000 UTC m=+121.157024254 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.061655 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-qc826" Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.069202 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.084488 4735 patch_prober.go:28] interesting pod/router-default-5444994796-qc826 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.084555 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qc826" podUID="79bfc45c-3ae9-416b-9d4b-f56fab2387de" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Oct 01 10:19:22 crc kubenswrapper[4735]: E1001 10:19:22.105438 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:22.605406305 +0000 UTC m=+121.298227567 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.106815 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:22 crc kubenswrapper[4735]: E1001 10:19:22.107452 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:22.60743593 +0000 UTC m=+121.300257192 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.213800 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:22 crc kubenswrapper[4735]: E1001 10:19:22.213938 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:22.713907292 +0000 UTC m=+121.406728554 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.214745 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:22 crc kubenswrapper[4735]: E1001 10:19:22.215066 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:22.715054852 +0000 UTC m=+121.407876114 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.243129 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-drrzv"] Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.277049 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwvhh"] Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.286911 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-xtfsg"] Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.302817 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fxcpc"] Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.316068 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:22 crc kubenswrapper[4735]: E1001 10:19:22.316361 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-01 10:19:22.816334405 +0000 UTC m=+121.509155707 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.332355 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-d4p82"] Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.352311 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-dm5x5" podStartSLOduration=101.352282378 podStartE2EDuration="1m41.352282378s" podCreationTimestamp="2025-10-01 10:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:22.351204749 +0000 UTC m=+121.044026011" watchObservedRunningTime="2025-10-01 10:19:22.352282378 +0000 UTC m=+121.045103640" Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.391364 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-sjt7w" podStartSLOduration=101.391348584 podStartE2EDuration="1m41.391348584s" podCreationTimestamp="2025-10-01 10:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:22.390705597 +0000 UTC m=+121.083526859" watchObservedRunningTime="2025-10-01 10:19:22.391348584 +0000 UTC m=+121.084169846" Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.419097 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:22 crc kubenswrapper[4735]: E1001 10:19:22.419544 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:22.91952929 +0000 UTC m=+121.612350552 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.522224 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:22 crc kubenswrapper[4735]: E1001 10:19:22.522377 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:23.022352364 +0000 UTC m=+121.715173626 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.523570 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:22 crc kubenswrapper[4735]: E1001 10:19:22.523873 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:23.023862004 +0000 UTC m=+121.716683266 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.616133 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" event={"ID":"6718bff7-4d68-4aa2-ad2b-1511e0799683","Type":"ContainerStarted","Data":"3cd54128a19b4a8fea5063b4ececd3e84921cc148f61cb26e80a75480bf4f83b"} Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.626151 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:22 crc kubenswrapper[4735]: E1001 10:19:22.626367 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:23.126336699 +0000 UTC m=+121.819157971 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.626722 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:22 crc kubenswrapper[4735]: E1001 10:19:22.627623 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:23.127610133 +0000 UTC m=+121.820431405 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.633696 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fxcpc" event={"ID":"2245b41c-0ccc-47ae-92c8-84ecfd80c53e","Type":"ContainerStarted","Data":"4fcb00832dd9e9b384bb1deb18a321366ee2b5589a08c21c2070587857c5e4b0"} Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.660940 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwvhh" event={"ID":"28a73dbc-e661-4ceb-80cb-66a21c6895e1","Type":"ContainerStarted","Data":"09f0e6af7a0fe7b0a942a9ec5dd95312a2ded99c66c0e085ee171b68df904e48"} Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.690516 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-dnlqj"] Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.728114 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:22 crc kubenswrapper[4735]: E1001 10:19:22.728527 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:23.228511436 +0000 UTC m=+121.921332698 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.741767 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7962v" event={"ID":"8e03e157-1b07-41e4-80a2-c751ca26e2a7","Type":"ContainerStarted","Data":"ba490e2324925356aa02679574b2b7a3632a0126c9d3a10633a0eac23cf94c5b"} Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.741808 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7962v" event={"ID":"8e03e157-1b07-41e4-80a2-c751ca26e2a7","Type":"ContainerStarted","Data":"2b35411076d7347cb76d543ded9b73f7b682c921c39a0fee483c54657d86baf2"} Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.752750 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-xtfsg" event={"ID":"5e28238a-9bf2-4e10-827d-7350e0ec0150","Type":"ContainerStarted","Data":"d575abb946245bd204701aa04c8d83889b916e25ea86e731d250a799a4af2dd9"} Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.762309 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-279pq" event={"ID":"2236757b-1fd9-4aea-9b6d-e5f56ae7ff42","Type":"ContainerStarted","Data":"395ae88116d6428813e74759bd7662afe5b08de23c11d76c04e3845b4f36004f"} Oct 01 
10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.777067 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rcv57" event={"ID":"33d4526f-3a59-40c2-b9c9-d93ced5dcd17","Type":"ContainerStarted","Data":"365bfd54a4c148b4689959da3daebc3babb346df30559df04e66153cfee16816"} Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.786263 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8dkrf" event={"ID":"4f6a88e2-4257-4ed0-9381-6dcd9d5647bb","Type":"ContainerStarted","Data":"300492345a1ea1d126dbe57a7b4a0de45200683b526e200c5a2acd7dbcb0d263"} Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.786329 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8dkrf" event={"ID":"4f6a88e2-4257-4ed0-9381-6dcd9d5647bb","Type":"ContainerStarted","Data":"81b953a03084d553ea47ebbc5236c471e2eee09d8108cefbec72057c0e922652"} Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.800827 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv" event={"ID":"dc4804a2-9d77-4820-bebd-93094a7143ec","Type":"ContainerStarted","Data":"8c744c8fd8fe60746c6ba87617b9c37670d8892d41e5087f868264e9166e68e0"} Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.812330 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-d4p82" event={"ID":"5e814370-2cf9-4f05-ba2a-cf771d55529f","Type":"ContainerStarted","Data":"ff292f7a226aa46a0bb59c0b004b8b29c7ff985dae98e4570b5a191cd9bbb1db"} Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.814407 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wbs79" 
event={"ID":"bf1e6345-2675-413e-bd53-d456e57b08bd","Type":"ContainerStarted","Data":"1f9e2fa9e342bdbb074acae0e27f096f6a2f5bcd3bae1ed067cf1e4ec58a1b32"} Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.814466 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wbs79" event={"ID":"bf1e6345-2675-413e-bd53-d456e57b08bd","Type":"ContainerStarted","Data":"d34ae32d5f39d406d414c6692dcccd73354c2ed3f897262a19805b7e5d30cfb2"} Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.815785 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-wbs79" Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.824198 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kkmzc" event={"ID":"5afd3b9d-a9e2-4a51-9e1f-fcebffc8215b","Type":"ContainerStarted","Data":"0324932f3fe16ebd62f2a92509b6b6c9b34bccef322ca0b86a2e241898a2e49d"} Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.835069 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:22 crc kubenswrapper[4735]: E1001 10:19:22.835615 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:23.335600245 +0000 UTC m=+122.028421507 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.850322 4735 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-wbs79 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.850372 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-wbs79" podUID="bf1e6345-2675-413e-bd53-d456e57b08bd" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.850393 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-drrzv" event={"ID":"1ba38415-c2e4-4550-a610-40bcee6d1323","Type":"ContainerStarted","Data":"c5b67507f29204d64587230ee870b9fa534ff04124993f6edb2cbe5ac838e235"} Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.863188 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-tg55d" event={"ID":"4d1aa0d7-e9f4-48eb-86ef-ec9c9451a447","Type":"ContainerStarted","Data":"c23992ff5c67070e381a62ed163cfebb4330883d0510b0c8065f1a5f3234079c"} Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.877946 
4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvc25" event={"ID":"70b90ab7-92ad-43c7-9796-15ec49caae3e","Type":"ContainerStarted","Data":"10af5117a72be2038c72ae37af0543ce92053fe8f1a201eb8afb57e367b79d7a"} Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.881246 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-76rns" event={"ID":"1d4a525c-7b1a-4bde-976a-d4b938c27209","Type":"ContainerStarted","Data":"bbfc98a16b43c35aaae1b328ef4e74ddd12b5f024e50d5543a5caafcf93878d0"} Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.886346 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwt8j" event={"ID":"a862cad8-e4de-41c5-a83b-734574bc958e","Type":"ContainerStarted","Data":"7fa0c6b1ff770770c0ffbf07bc4e4dbf838c98d1c0ef65db05c5777fbc71202b"} Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.899192 4735 patch_prober.go:28] interesting pod/downloads-7954f5f757-dm5x5 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.899232 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dm5x5" podUID="c8642ffb-52ec-44db-b2f8-33a1e98b5328" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.921316 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-625dj" Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.933908 4735 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gbmw6" podStartSLOduration=100.933881407 podStartE2EDuration="1m40.933881407s" podCreationTimestamp="2025-10-01 10:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:22.89817131 +0000 UTC m=+121.590992562" watchObservedRunningTime="2025-10-01 10:19:22.933881407 +0000 UTC m=+121.626702679" Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.947163 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:22 crc kubenswrapper[4735]: E1001 10:19:22.948615 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:23.448596811 +0000 UTC m=+122.141418083 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:22 crc kubenswrapper[4735]: I1001 10:19:22.963527 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-625dj" podStartSLOduration=100.96349065 podStartE2EDuration="1m40.96349065s" podCreationTimestamp="2025-10-01 10:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:22.960088219 +0000 UTC m=+121.652909481" watchObservedRunningTime="2025-10-01 10:19:22.96349065 +0000 UTC m=+121.656311912" Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.011760 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-qc826" podStartSLOduration=101.011740322 podStartE2EDuration="1m41.011740322s" podCreationTimestamp="2025-10-01 10:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:23.008333841 +0000 UTC m=+121.701155093" watchObservedRunningTime="2025-10-01 10:19:23.011740322 +0000 UTC m=+121.704561584" Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.012574 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qmlh7"] Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.049851 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.055389 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-sksft"] Oct 01 10:19:23 crc kubenswrapper[4735]: E1001 10:19:23.060632 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:23.560615451 +0000 UTC m=+122.253436793 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.064921 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fkcd4"] Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.071298 4735 patch_prober.go:28] interesting pod/router-default-5444994796-qc826 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 10:19:23 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 01 10:19:23 crc kubenswrapper[4735]: [+]process-running ok Oct 01 10:19:23 crc kubenswrapper[4735]: healthz check failed Oct 01 
10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.071352 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qc826" podUID="79bfc45c-3ae9-416b-9d4b-f56fab2387de" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.088057 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-l67sz"] Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.091958 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-thbzv"] Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.092287 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-kgr86" podStartSLOduration=101.09227719 podStartE2EDuration="1m41.09227719s" podCreationTimestamp="2025-10-01 10:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:23.080010151 +0000 UTC m=+121.772831413" watchObservedRunningTime="2025-10-01 10:19:23.09227719 +0000 UTC m=+121.785098452" Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.116453 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7rvp"] Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.143706 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-sjt7w" Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.151841 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:23 crc kubenswrapper[4735]: E1001 10:19:23.152141 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:23.652127303 +0000 UTC m=+122.344948555 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.153168 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321895-x8dfs"] Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.157201 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kkmzc" podStartSLOduration=101.157184098 podStartE2EDuration="1m41.157184098s" podCreationTimestamp="2025-10-01 10:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:23.156929462 +0000 UTC m=+121.849750734" watchObservedRunningTime="2025-10-01 10:19:23.157184098 +0000 UTC m=+121.850005360" Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.168874 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-xzlpn"] Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.186288 4735 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9r92q"] Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.232694 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kqslj" podStartSLOduration=101.23266023 podStartE2EDuration="1m41.23266023s" podCreationTimestamp="2025-10-01 10:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:23.212544272 +0000 UTC m=+121.905365534" watchObservedRunningTime="2025-10-01 10:19:23.23266023 +0000 UTC m=+121.925481492" Oct 01 10:19:23 crc kubenswrapper[4735]: W1001 10:19:23.249017 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76458fc4_8f1e_462d_8a5c_1af31c52f7b6.slice/crio-3d31f0b9d3fc7fdf7256efcd7082d87670b000ec61e9c266c7842fd7f1b1e90f WatchSource:0}: Error finding container 3d31f0b9d3fc7fdf7256efcd7082d87670b000ec61e9c266c7842fd7f1b1e90f: Status 404 returned error can't find the container with id 3d31f0b9d3fc7fdf7256efcd7082d87670b000ec61e9c266c7842fd7f1b1e90f Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.253112 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:23 crc kubenswrapper[4735]: E1001 10:19:23.253643 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-01 10:19:23.753630262 +0000 UTC m=+122.446451524 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.299626 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2jcl8"] Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.301093 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-sngf8"] Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.335557 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-phjsf"] Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.360632 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:23 crc kubenswrapper[4735]: E1001 10:19:23.361015 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:23.861001348 +0000 UTC m=+122.553822600 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.371433 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fkz86"] Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.397239 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-cx2v2"] Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.442657 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cqpz2"] Oct 01 10:19:23 crc kubenswrapper[4735]: W1001 10:19:23.451606 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc0f68ff_8c8c_40f6_bd13_e0bec87fb9f2.slice/crio-8cc64dd9907e86366519d11a78c2bbc22865281818a3b5be7c1bdb1365fcaa63 WatchSource:0}: Error finding container 8cc64dd9907e86366519d11a78c2bbc22865281818a3b5be7c1bdb1365fcaa63: Status 404 returned error can't find the container with id 8cc64dd9907e86366519d11a78c2bbc22865281818a3b5be7c1bdb1365fcaa63 Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.462570 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 
10:19:23 crc kubenswrapper[4735]: E1001 10:19:23.463043 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:23.963006951 +0000 UTC m=+122.655828213 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:23 crc kubenswrapper[4735]: W1001 10:19:23.466882 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cdf3b9f_6a98_4b7c_9953_a8623f968d89.slice/crio-d254df0114c64785d824f15c11eb70e889e9fe19ce875714f0480a74d9c978fb WatchSource:0}: Error finding container d254df0114c64785d824f15c11eb70e889e9fe19ce875714f0480a74d9c978fb: Status 404 returned error can't find the container with id d254df0114c64785d824f15c11eb70e889e9fe19ce875714f0480a74d9c978fb Oct 01 10:19:23 crc kubenswrapper[4735]: W1001 10:19:23.477412 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81c0d8ce_d766_46bb_bd2b_79043b149331.slice/crio-9b07c3d4c4274ebbf19748732b6bf76410754698fd0f26f5587ab04898fe56c4 WatchSource:0}: Error finding container 9b07c3d4c4274ebbf19748732b6bf76410754698fd0f26f5587ab04898fe56c4: Status 404 returned error can't find the container with id 9b07c3d4c4274ebbf19748732b6bf76410754698fd0f26f5587ab04898fe56c4 Oct 01 10:19:23 crc kubenswrapper[4735]: W1001 10:19:23.505155 4735 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb79ac2f9_c8bc_4893_ace2_ca598d77ff52.slice/crio-da1327489f1e1b175d745a062350fa6226ded86cd48081d7806da155d1c51ecd WatchSource:0}: Error finding container da1327489f1e1b175d745a062350fa6226ded86cd48081d7806da155d1c51ecd: Status 404 returned error can't find the container with id da1327489f1e1b175d745a062350fa6226ded86cd48081d7806da155d1c51ecd Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.563113 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:23 crc kubenswrapper[4735]: E1001 10:19:23.563764 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:24.06374778 +0000 UTC m=+122.756569042 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.597210 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-76rns" podStartSLOduration=101.597192116 podStartE2EDuration="1m41.597192116s" podCreationTimestamp="2025-10-01 10:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:23.596469026 +0000 UTC m=+122.289290298" watchObservedRunningTime="2025-10-01 10:19:23.597192116 +0000 UTC m=+122.290013378" Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.666615 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:23 crc kubenswrapper[4735]: E1001 10:19:23.666920 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:24.166908752 +0000 UTC m=+122.859730014 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.683928 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8dkrf" podStartSLOduration=101.683912849 podStartE2EDuration="1m41.683912849s" podCreationTimestamp="2025-10-01 10:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:23.64404966 +0000 UTC m=+122.336870922" watchObservedRunningTime="2025-10-01 10:19:23.683912849 +0000 UTC m=+122.376734111" Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.684031 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-tg55d" podStartSLOduration=5.684025602 podStartE2EDuration="5.684025602s" podCreationTimestamp="2025-10-01 10:19:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:23.682567613 +0000 UTC m=+122.375388885" watchObservedRunningTime="2025-10-01 10:19:23.684025602 +0000 UTC m=+122.376846864" Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.766636 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-wbs79" podStartSLOduration=102.766616944 podStartE2EDuration="1m42.766616944s" podCreationTimestamp="2025-10-01 10:17:41 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:23.764129797 +0000 UTC m=+122.456951059" watchObservedRunningTime="2025-10-01 10:19:23.766616944 +0000 UTC m=+122.459438206" Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.777600 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:23 crc kubenswrapper[4735]: E1001 10:19:23.777910 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:24.277894776 +0000 UTC m=+122.970716038 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.795485 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7962v" podStartSLOduration=101.795462756 podStartE2EDuration="1m41.795462756s" podCreationTimestamp="2025-10-01 10:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:23.793844743 +0000 UTC m=+122.486666005" watchObservedRunningTime="2025-10-01 10:19:23.795462756 +0000 UTC m=+122.488284018" Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.879079 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:23 crc kubenswrapper[4735]: E1001 10:19:23.879457 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:24.379443446 +0000 UTC m=+123.072264708 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.883344 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv" podStartSLOduration=101.8833216 podStartE2EDuration="1m41.8833216s" podCreationTimestamp="2025-10-01 10:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:23.88184206 +0000 UTC m=+122.574663332" watchObservedRunningTime="2025-10-01 10:19:23.8833216 +0000 UTC m=+122.576142862" Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.984050 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:23 crc kubenswrapper[4735]: E1001 10:19:23.984672 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:24.484657494 +0000 UTC m=+123.177478746 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:23 crc kubenswrapper[4735]: I1001 10:19:23.997111 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2jcl8" event={"ID":"56f4b38b-92d5-434f-8b9e-dff6c5cb054a","Type":"ContainerStarted","Data":"5b20edd36385d105864ac07e9367c6949ff21d879198525df6b3d062f965d295"} Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.008867 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-thbzv" event={"ID":"b5fb857f-fab1-49d9-9c0b-a9a8bcf1dd11","Type":"ContainerStarted","Data":"244c7308e31fa6d1e88f909b6ef9736cf977f3d7bde6c0c32e720ecf063f94ca"} Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.062593 4735 patch_prober.go:28] interesting pod/router-default-5444994796-qc826 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 10:19:24 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 01 10:19:24 crc kubenswrapper[4735]: [+]process-running ok Oct 01 10:19:24 crc kubenswrapper[4735]: healthz check failed Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.062687 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qc826" podUID="79bfc45c-3ae9-416b-9d4b-f56fab2387de" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 10:19:24 crc 
kubenswrapper[4735]: I1001 10:19:24.066234 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7rvp" event={"ID":"c3ecaab8-1253-4559-bc4f-76871e1edff8","Type":"ContainerStarted","Data":"1355140d7b6d7c41cf220c4472348d27d3aef285c5e0e832e405359c1a6fe9cd"} Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.081994 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwt8j" event={"ID":"a862cad8-e4de-41c5-a83b-734574bc958e","Type":"ContainerStarted","Data":"5d30f74130a86cbd597bc61f1dec51638cbc60dc5487d1bfe92d6e1bb4af7178"} Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.084021 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321895-x8dfs" event={"ID":"fa5136cb-51cf-4bc5-8839-d3cbf64e4c39","Type":"ContainerStarted","Data":"c2bc6bdc2b2d7708ce412532abde428d99bc1b06336686d6c59270c362cea21b"} Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.085822 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:24 crc kubenswrapper[4735]: E1001 10:19:24.086117 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:24.586106132 +0000 UTC m=+123.278927394 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.107713 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvc25" event={"ID":"70b90ab7-92ad-43c7-9796-15ec49caae3e","Type":"ContainerStarted","Data":"4ec412ec507ee4743484512687974cfc9bf38227becf7a9d8d29658241556197"} Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.108514 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvc25" Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.135083 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-hwt8j" podStartSLOduration=103.135062693 podStartE2EDuration="1m43.135062693s" podCreationTimestamp="2025-10-01 10:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:24.133263715 +0000 UTC m=+122.826084977" watchObservedRunningTime="2025-10-01 10:19:24.135062693 +0000 UTC m=+122.827883955" Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.136583 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cx2v2" event={"ID":"fc0f68ff-8c8c-40f6-bd13-e0bec87fb9f2","Type":"ContainerStarted","Data":"8cc64dd9907e86366519d11a78c2bbc22865281818a3b5be7c1bdb1365fcaa63"} Oct 01 10:19:24 crc kubenswrapper[4735]: 
I1001 10:19:24.137078 4735 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-hvc25 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.137133 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvc25" podUID="70b90ab7-92ad-43c7-9796-15ec49caae3e" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.152277 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-d4p82" event={"ID":"5e814370-2cf9-4f05-ba2a-cf771d55529f","Type":"ContainerStarted","Data":"e7edf84945b019163e13f9984151e551cc585c39acb3d940cb5c6d8bbae7a4e2"} Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.167449 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvc25" podStartSLOduration=102.16743243 podStartE2EDuration="1m42.16743243s" podCreationTimestamp="2025-10-01 10:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:24.164733208 +0000 UTC m=+122.857554470" watchObservedRunningTime="2025-10-01 10:19:24.16743243 +0000 UTC m=+122.860253692" Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.191573 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:24 crc kubenswrapper[4735]: E1001 10:19:24.192695 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:24.692672696 +0000 UTC m=+123.385494018 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.195481 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-279pq" event={"ID":"2236757b-1fd9-4aea-9b6d-e5f56ae7ff42","Type":"ContainerStarted","Data":"d5fc575091add4967a90cb8da461203087c58d5122cdbb029c3069aac7fb0294"} Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.225872 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" event={"ID":"6718bff7-4d68-4aa2-ad2b-1511e0799683","Type":"ContainerStarted","Data":"7113c55e7631ee46bd97a454731e1cd9f8656953a1e03ee0c1649f71a146907b"} Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.226641 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.251025 4735 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-rs7ln container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure 
output="Get \"https://10.217.0.31:6443/healthz\": dial tcp 10.217.0.31:6443: connect: connection refused" start-of-body= Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.251071 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" podUID="6718bff7-4d68-4aa2-ad2b-1511e0799683" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.31:6443/healthz\": dial tcp 10.217.0.31:6443: connect: connection refused" Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.251484 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-xzlpn" event={"ID":"76458fc4-8f1e-462d-8a5c-1af31c52f7b6","Type":"ContainerStarted","Data":"3d31f0b9d3fc7fdf7256efcd7082d87670b000ec61e9c266c7842fd7f1b1e90f"} Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.286098 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cx2v2" podStartSLOduration=102.286073798 podStartE2EDuration="1m42.286073798s" podCreationTimestamp="2025-10-01 10:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:24.214606594 +0000 UTC m=+122.907427856" watchObservedRunningTime="2025-10-01 10:19:24.286073798 +0000 UTC m=+122.978895060" Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.303330 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:24 crc kubenswrapper[4735]: E1001 10:19:24.304967 4735 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:24.804955874 +0000 UTC m=+123.497777136 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.308709 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fxcpc" event={"ID":"2245b41c-0ccc-47ae-92c8-84ecfd80c53e","Type":"ContainerStarted","Data":"5ed34d6f3c2687240b0ab53499433f269389471acb4b80dedb1e307cf6747b77"} Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.309881 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fxcpc" Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.324109 4735 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fxcpc container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:5443/healthz\": dial tcp 10.217.0.21:5443: connect: connection refused" start-of-body= Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.324268 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fxcpc" podUID="2245b41c-0ccc-47ae-92c8-84ecfd80c53e" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.21:5443/healthz\": dial tcp 10.217.0.21:5443: connect: 
connection refused" Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.357900 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fkcd4" event={"ID":"9631d017-2723-4e2f-b285-c38333929edf","Type":"ContainerStarted","Data":"5027ce56bd100719f64d1e341f49944c264fccd8624bca905e3f64901e58f4b1"} Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.357948 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fkcd4" event={"ID":"9631d017-2723-4e2f-b285-c38333929edf","Type":"ContainerStarted","Data":"fad350eb983ba2220c35c61ccb3d79d8b3d5d4c5728194db7b37f3c5432b1a3f"} Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.375455 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-279pq" podStartSLOduration=104.375439651 podStartE2EDuration="1m44.375439651s" podCreationTimestamp="2025-10-01 10:17:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:24.27979384 +0000 UTC m=+122.972615102" watchObservedRunningTime="2025-10-01 10:19:24.375439651 +0000 UTC m=+123.068260913" Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.376192 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" podStartSLOduration=103.376184012 podStartE2EDuration="1m43.376184012s" podCreationTimestamp="2025-10-01 10:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:24.373331955 +0000 UTC m=+123.066153217" watchObservedRunningTime="2025-10-01 10:19:24.376184012 +0000 UTC m=+123.069005274" Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.387346 4735 
generic.go:334] "Generic (PLEG): container finished" podID="727a1c24-3564-4d03-910d-a1bda2a3667f" containerID="62f8dbdea3c732a2057659c5c71c3a486f5cd53752d18a80a097f9e3c2481299" exitCode=0 Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.387421 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnlqj" event={"ID":"727a1c24-3564-4d03-910d-a1bda2a3667f","Type":"ContainerDied","Data":"62f8dbdea3c732a2057659c5c71c3a486f5cd53752d18a80a097f9e3c2481299"} Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.387448 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnlqj" event={"ID":"727a1c24-3564-4d03-910d-a1bda2a3667f","Type":"ContainerStarted","Data":"749ed06e269dda9250c1eeed1a13297b2434b9fc5f09fc5bb0495b1376761db7"} Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.406324 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:24 crc kubenswrapper[4735]: E1001 10:19:24.406642 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:24.906617997 +0000 UTC m=+123.599439259 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.415411 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rcv57" event={"ID":"33d4526f-3a59-40c2-b9c9-d93ced5dcd17","Type":"ContainerStarted","Data":"34b80bcf247984d0aaecdfa0f3faa8f407a23f4d00341ee632d45777d52c303f"} Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.444694 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-drrzv" event={"ID":"1ba38415-c2e4-4550-a610-40bcee6d1323","Type":"ContainerStarted","Data":"3ee4012ffc70f5476e0e36bfbbc3a377c04e7b22be8b3cafb479427db4d53883"} Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.479916 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" event={"ID":"8b62c04d-523d-4858-bddf-5d2162392962","Type":"ContainerStarted","Data":"ef38186855b30aa35ed50eb8596a3a98f13396ec69d2cf270eff820e40505ac7"} Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.479960 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" event={"ID":"8b62c04d-523d-4858-bddf-5d2162392962","Type":"ContainerStarted","Data":"b8627c89164343bc705ef2fa7ce0529df0096d952824fbf5b38fa745d1ec5d8c"} Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.495718 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-l67sz" 
event={"ID":"9999d995-4881-4349-ba17-89eef6722d16","Type":"ContainerStarted","Data":"f87d3bb536d72cde7090f90eb5b45d4ac8e1f72e0e0f210120f9e0eef7d6bcc1"} Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.508511 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:24 crc kubenswrapper[4735]: E1001 10:19:24.508932 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:25.008920397 +0000 UTC m=+123.701741659 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.520432 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rcv57" podStartSLOduration=102.520413045 podStartE2EDuration="1m42.520413045s" podCreationTimestamp="2025-10-01 10:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:24.519599173 +0000 UTC m=+123.212420445" watchObservedRunningTime="2025-10-01 
10:19:24.520413045 +0000 UTC m=+123.213234307" Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.521850 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sngf8" event={"ID":"afb4417d-4b1d-4d1e-8724-a78b3781b4f1","Type":"ContainerStarted","Data":"c77da4a869910e82e9372837be08c0615d8cc293006283b5f236746acc31bf80"} Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.529426 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-xzlpn" podStartSLOduration=102.529404706 podStartE2EDuration="1m42.529404706s" podCreationTimestamp="2025-10-01 10:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:24.429581322 +0000 UTC m=+123.122402584" watchObservedRunningTime="2025-10-01 10:19:24.529404706 +0000 UTC m=+123.222225968" Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.533507 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-phjsf" event={"ID":"7cdf3b9f-6a98-4b7c-9953-a8623f968d89","Type":"ContainerStarted","Data":"d254df0114c64785d824f15c11eb70e889e9fe19ce875714f0480a74d9c978fb"} Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.554341 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fxcpc" podStartSLOduration=102.554318073 podStartE2EDuration="1m42.554318073s" podCreationTimestamp="2025-10-01 10:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:24.551166359 +0000 UTC m=+123.243987621" watchObservedRunningTime="2025-10-01 10:19:24.554318073 +0000 UTC m=+123.247139335" Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.590227 4735 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cqpz2" event={"ID":"b79ac2f9-c8bc-4893-ace2-ca598d77ff52","Type":"ContainerStarted","Data":"da1327489f1e1b175d745a062350fa6226ded86cd48081d7806da155d1c51ecd"} Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.590941 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-cqpz2" Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.600778 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-tg55d" event={"ID":"4d1aa0d7-e9f4-48eb-86ef-ec9c9451a447","Type":"ContainerStarted","Data":"4e88d0fe2824faa2efd21fe36ee2b906e9fabdfd4dc390cf5eb5859ab26625ec"} Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.607509 4735 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-cqpz2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.607559 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-cqpz2" podUID="b79ac2f9-c8bc-4893-ace2-ca598d77ff52" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.613831 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:24 crc kubenswrapper[4735]: E1001 
10:19:24.619278 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:25.119250483 +0000 UTC m=+123.812071745 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.631180 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:24 crc kubenswrapper[4735]: E1001 10:19:24.633152 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:25.133131134 +0000 UTC m=+123.825952396 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.656677 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-76rns" event={"ID":"1d4a525c-7b1a-4bde-976a-d4b938c27209","Type":"ContainerStarted","Data":"8a03b239efaa310908f829754012405bff18bf6754ea32bc5720a70e62c4d8d7"} Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.716427 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-sksft" event={"ID":"97d879b1-ea51-444b-8c31-5dc89c3a1fb2","Type":"ContainerStarted","Data":"e0e5752b2a5f73260294f456ab4cec7b6420fb8716b7ecf4f736d09cd5443996"} Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.716482 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-sksft" event={"ID":"97d879b1-ea51-444b-8c31-5dc89c3a1fb2","Type":"ContainerStarted","Data":"215e09a4591f19288e40f6fec3e674ea163423564b2178936ba0ba0cd9976444"} Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.734551 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-drrzv" podStartSLOduration=102.734530081 podStartE2EDuration="1m42.734530081s" podCreationTimestamp="2025-10-01 10:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-01 10:19:24.654324993 +0000 UTC m=+123.347146255" watchObservedRunningTime="2025-10-01 10:19:24.734530081 +0000 UTC m=+123.427351343" Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.735362 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.735409 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" podStartSLOduration=103.735404334 podStartE2EDuration="1m43.735404334s" podCreationTimestamp="2025-10-01 10:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:24.734989773 +0000 UTC m=+123.427811035" watchObservedRunningTime="2025-10-01 10:19:24.735404334 +0000 UTC m=+123.428225596" Oct 01 10:19:24 crc kubenswrapper[4735]: E1001 10:19:24.735851 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:25.235836056 +0000 UTC m=+123.928657318 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.755079 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwvhh" event={"ID":"28a73dbc-e661-4ceb-80cb-66a21c6895e1","Type":"ContainerStarted","Data":"2245124f770fd5ed7b08bee5a21dfe32987cfb160664c1e19f701061aa5ff0f2"} Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.774422 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qmlh7" event={"ID":"f8678dbc-4d5c-4081-af9f-b26869f98034","Type":"ContainerStarted","Data":"b5f8fd9007843abbc7f596b7460cf5e78b97ce518b59688522aefbb14058bbb3"} Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.774484 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qmlh7" event={"ID":"f8678dbc-4d5c-4081-af9f-b26869f98034","Type":"ContainerStarted","Data":"3a67fc70e6218bf42fe268d4c7b3506b7361c198014d120376e5d0e2bbbfdca9"} Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.777455 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-xtfsg" event={"ID":"5e28238a-9bf2-4e10-827d-7350e0ec0150","Type":"ContainerStarted","Data":"ec4183f8ea6399bd1c4a90b48a1fbe5d44ce980275e436752305478a24a805d8"} Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.778968 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-9r92q" 
event={"ID":"4f1dfc67-48c4-4c0d-9807-6b962a542d71","Type":"ContainerStarted","Data":"4548089aa97d4a9fef428c3c048be7334eb5ced1dd5e8c5cbf3b3d5d68b18c19"} Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.785313 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fkz86" event={"ID":"81c0d8ce-d766-46bb-bd2b-79043b149331","Type":"ContainerStarted","Data":"9b07c3d4c4274ebbf19748732b6bf76410754698fd0f26f5587ab04898fe56c4"} Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.791650 4735 patch_prober.go:28] interesting pod/downloads-7954f5f757-dm5x5 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.791702 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dm5x5" podUID="c8642ffb-52ec-44db-b2f8-33a1e98b5328" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.807220 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-wbs79" Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.832237 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-cqpz2" podStartSLOduration=102.832222228 podStartE2EDuration="1m42.832222228s" podCreationTimestamp="2025-10-01 10:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:24.774377847 +0000 UTC m=+123.467199109" watchObservedRunningTime="2025-10-01 10:19:24.832222228 +0000 UTC m=+123.525043480" Oct 01 10:19:24 crc 
kubenswrapper[4735]: I1001 10:19:24.840806 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:24 crc kubenswrapper[4735]: E1001 10:19:24.851616 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:25.351588287 +0000 UTC m=+124.044409629 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.898144 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-sksft" podStartSLOduration=103.898130473 podStartE2EDuration="1m43.898130473s" podCreationTimestamp="2025-10-01 10:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:24.896696605 +0000 UTC m=+123.589517857" watchObservedRunningTime="2025-10-01 10:19:24.898130473 +0000 UTC m=+123.590951735" Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.898838 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwvhh" podStartSLOduration=102.898833421 podStartE2EDuration="1m42.898833421s" podCreationTimestamp="2025-10-01 10:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:24.841092905 +0000 UTC m=+123.533914157" watchObservedRunningTime="2025-10-01 10:19:24.898833421 +0000 UTC m=+123.591654683" Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.944262 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:24 crc kubenswrapper[4735]: E1001 10:19:24.944605 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:25.444579127 +0000 UTC m=+124.137400389 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.973725 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-fkz86" podStartSLOduration=6.973706237 podStartE2EDuration="6.973706237s" podCreationTimestamp="2025-10-01 10:19:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:24.972715921 +0000 UTC m=+123.665537183" watchObservedRunningTime="2025-10-01 10:19:24.973706237 +0000 UTC m=+123.666527499" Oct 01 10:19:24 crc kubenswrapper[4735]: I1001 10:19:24.974821 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-xtfsg" podStartSLOduration=103.974801787 podStartE2EDuration="1m43.974801787s" podCreationTimestamp="2025-10-01 10:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:24.943835107 +0000 UTC m=+123.636656369" watchObservedRunningTime="2025-10-01 10:19:24.974801787 +0000 UTC m=+123.667623049" Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.050441 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:25 crc kubenswrapper[4735]: E1001 10:19:25.050956 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:25.550940646 +0000 UTC m=+124.243761908 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.062599 4735 patch_prober.go:28] interesting pod/router-default-5444994796-qc826 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 10:19:25 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 01 10:19:25 crc kubenswrapper[4735]: [+]process-running ok Oct 01 10:19:25 crc kubenswrapper[4735]: healthz check failed Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.062697 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qc826" podUID="79bfc45c-3ae9-416b-9d4b-f56fab2387de" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.065416 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-9r92q" podStartSLOduration=103.065404213 podStartE2EDuration="1m43.065404213s" 
podCreationTimestamp="2025-10-01 10:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:25.064985462 +0000 UTC m=+123.757806734" watchObservedRunningTime="2025-10-01 10:19:25.065404213 +0000 UTC m=+123.758225475" Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.151789 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:25 crc kubenswrapper[4735]: E1001 10:19:25.152010 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:25.651986043 +0000 UTC m=+124.344807305 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.152145 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:25 crc kubenswrapper[4735]: E1001 10:19:25.152514 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:25.652486386 +0000 UTC m=+124.345307648 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.253483 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:25 crc kubenswrapper[4735]: E1001 10:19:25.253598 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:25.753569284 +0000 UTC m=+124.446390546 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.253706 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:25 crc kubenswrapper[4735]: E1001 10:19:25.254016 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:25.754004256 +0000 UTC m=+124.446825518 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.354541 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:25 crc kubenswrapper[4735]: E1001 10:19:25.354718 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:25.854693763 +0000 UTC m=+124.547515025 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.355327 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:25 crc kubenswrapper[4735]: E1001 10:19:25.355921 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:25.855907305 +0000 UTC m=+124.548728567 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.456909 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:25 crc kubenswrapper[4735]: E1001 10:19:25.457251 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:25.957235609 +0000 UTC m=+124.650056871 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.514295 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv" Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.514502 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv" Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.527060 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv" Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.558794 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:25 crc kubenswrapper[4735]: E1001 10:19:25.559099 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:26.059084958 +0000 UTC m=+124.751906220 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.574728 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.574888 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.659471 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:25 crc kubenswrapper[4735]: E1001 10:19:25.659628 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:26.159603091 +0000 UTC m=+124.852424353 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.659720 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:25 crc kubenswrapper[4735]: E1001 10:19:25.660127 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:26.160116074 +0000 UTC m=+124.852937416 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.760887 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:25 crc kubenswrapper[4735]: E1001 10:19:25.761129 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:26.261101759 +0000 UTC m=+124.953923021 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.761271 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:25 crc kubenswrapper[4735]: E1001 10:19:25.761673 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:26.261657094 +0000 UTC m=+124.954478346 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.791520 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qmlh7" event={"ID":"f8678dbc-4d5c-4081-af9f-b26869f98034","Type":"ContainerStarted","Data":"cdbaeae3bda8eba9f7ee2b6de1464b324c73e08c6d5f2a6cfe9a775af12719e5"} Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.791638 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-qmlh7" Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.793219 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sngf8" event={"ID":"afb4417d-4b1d-4d1e-8724-a78b3781b4f1","Type":"ContainerStarted","Data":"e551add06122dc853c6f679d57ed4b89b8cc9eab173e8b83d695c61b2c0876fa"} Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.793266 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sngf8" event={"ID":"afb4417d-4b1d-4d1e-8724-a78b3781b4f1","Type":"ContainerStarted","Data":"c05eab6f8e367cb9a02939e6949ce705a81d34c7703a9fad352c98a9e7906cea"} Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.794388 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fkz86" event={"ID":"81c0d8ce-d766-46bb-bd2b-79043b149331","Type":"ContainerStarted","Data":"6fe0e14b530be9622df78be8c9b8626a3b8126016b9524a059004de883fd3504"} Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 
10:19:25.795744 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-l67sz" event={"ID":"9999d995-4881-4349-ba17-89eef6722d16","Type":"ContainerStarted","Data":"658480ac2bf5eb8e637239f3f03a8c0e5cd3d91f61d18cb63b2528938087b3dd"} Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.796838 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fkcd4" event={"ID":"9631d017-2723-4e2f-b285-c38333929edf","Type":"ContainerStarted","Data":"cca3d456119a562724686d855ba3b6442fcc00a14a5611f401908462116e2706"} Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.797187 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fkcd4" Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.798251 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7rvp" event={"ID":"c3ecaab8-1253-4559-bc4f-76871e1edff8","Type":"ContainerStarted","Data":"c7beab099cba4c883fbf4d1d12e39916326452a215cad446702d234676f3fbb2"} Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.798777 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7rvp" Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.800610 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-d4p82" event={"ID":"5e814370-2cf9-4f05-ba2a-cf771d55529f","Type":"ContainerStarted","Data":"8479768fd597cbdec48202a762da3bee50e68336c337ed9d8d6a152e745baff5"} Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.801972 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-9r92q" 
event={"ID":"4f1dfc67-48c4-4c0d-9807-6b962a542d71","Type":"ContainerStarted","Data":"ac00868d8019944ae36ef03eb006e95138801fa4efd92c396ea9e661c4756420"} Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.806385 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cqpz2" event={"ID":"b79ac2f9-c8bc-4893-ace2-ca598d77ff52","Type":"ContainerStarted","Data":"8e45bfb355a4b3adea101c5d6b0e92f94f799dea4d3be26d4ef2eed8c62dc1c9"} Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.807308 4735 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-cqpz2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.807356 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-cqpz2" podUID="b79ac2f9-c8bc-4893-ace2-ca598d77ff52" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.808482 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cx2v2" event={"ID":"fc0f68ff-8c8c-40f6-bd13-e0bec87fb9f2","Type":"ContainerStarted","Data":"c1bafb00d8a3f348703057a34bb35c6876e6a612f7b0d6a8cf1a17181f096249"} Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.810633 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2jcl8" event={"ID":"56f4b38b-92d5-434f-8b9e-dff6c5cb054a","Type":"ContainerStarted","Data":"024e54bc9953757cdf308579219242785420eec56a5771753667c4a27f9fc083"} Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.810658 4735 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2jcl8" event={"ID":"56f4b38b-92d5-434f-8b9e-dff6c5cb054a","Type":"ContainerStarted","Data":"9e4d45dc4d45540826bfcf3017f447c8308fc3a53d44ec823fe0ed7b179c6fbf"} Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.815723 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7rvp" Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.818048 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-xzlpn" event={"ID":"76458fc4-8f1e-462d-8a5c-1af31c52f7b6","Type":"ContainerStarted","Data":"1a28d4645244c7c45cccc95ed6f7a5c54a115fb91b659c84bcfc098edfdfebc5"} Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.822193 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnlqj" event={"ID":"727a1c24-3564-4d03-910d-a1bda2a3667f","Type":"ContainerStarted","Data":"3baba9886705d80ce03a7dd9cb7ddc1778fa3b4b4ad061c750604ae2b008a9ad"} Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.822354 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnlqj" Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.824097 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-phjsf" event={"ID":"7cdf3b9f-6a98-4b7c-9953-a8623f968d89","Type":"ContainerStarted","Data":"258f784f4c7f321e0b9e35553673b906a8bd69c89f2f371bb071ed1ab0e97c75"} Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.824123 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-phjsf" 
event={"ID":"7cdf3b9f-6a98-4b7c-9953-a8623f968d89","Type":"ContainerStarted","Data":"af64c2887090bf107bb0f471484ccaa568839087840acd948f43447c3c0cfeca"} Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.825793 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321895-x8dfs" event={"ID":"fa5136cb-51cf-4bc5-8839-d3cbf64e4c39","Type":"ContainerStarted","Data":"87713f8cabf71727c4602948dc62159caee081a69a26b407259c0af4fadd7e48"} Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.827303 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-thbzv" event={"ID":"b5fb857f-fab1-49d9-9c0b-a9a8bcf1dd11","Type":"ContainerStarted","Data":"5fb533e5e77ff1de933c09820a79f220adbec42ceae34112d423453cd3a2faf4"} Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.827341 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-thbzv" event={"ID":"b5fb857f-fab1-49d9-9c0b-a9a8bcf1dd11","Type":"ContainerStarted","Data":"adad00f0da6f9c07db6ebcebff943dc8fd6875db65ab20019f237974e186f261"} Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.840701 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7nkv" Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.850943 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hvc25" Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.852089 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.862787 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:25 crc kubenswrapper[4735]: E1001 10:19:25.863761 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:26.363741288 +0000 UTC m=+125.056562630 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.864305 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:25 crc kubenswrapper[4735]: E1001 10:19:25.867037 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:26.367018997 +0000 UTC m=+125.059840349 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.957865 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-qmlh7" podStartSLOduration=7.95784943 podStartE2EDuration="7.95784943s" podCreationTimestamp="2025-10-01 10:19:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:25.938768308 +0000 UTC m=+124.631589590" watchObservedRunningTime="2025-10-01 10:19:25.95784943 +0000 UTC m=+124.650670692" Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.971958 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:25 crc kubenswrapper[4735]: E1001 10:19:25.973203 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:26.473185011 +0000 UTC m=+125.166006283 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:25 crc kubenswrapper[4735]: I1001 10:19:25.999806 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-phjsf" podStartSLOduration=103.999786203 podStartE2EDuration="1m43.999786203s" podCreationTimestamp="2025-10-01 10:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:25.997781859 +0000 UTC m=+124.690603121" watchObservedRunningTime="2025-10-01 10:19:25.999786203 +0000 UTC m=+124.692607455" Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.056807 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29321895-x8dfs" podStartSLOduration=105.056791929 podStartE2EDuration="1m45.056791929s" podCreationTimestamp="2025-10-01 10:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:26.053175573 +0000 UTC m=+124.745996835" watchObservedRunningTime="2025-10-01 10:19:26.056791929 +0000 UTC m=+124.749613191" Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.057809 4735 patch_prober.go:28] interesting pod/router-default-5444994796-qc826 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 10:19:26 crc 
kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 01 10:19:26 crc kubenswrapper[4735]: [+]process-running ok Oct 01 10:19:26 crc kubenswrapper[4735]: healthz check failed Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.057864 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qc826" podUID="79bfc45c-3ae9-416b-9d4b-f56fab2387de" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.073825 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:26 crc kubenswrapper[4735]: E1001 10:19:26.074265 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:26.574247358 +0000 UTC m=+125.267068690 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.169606 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-thbzv" podStartSLOduration=105.169584971 podStartE2EDuration="1m45.169584971s" podCreationTimestamp="2025-10-01 10:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:26.148041324 +0000 UTC m=+124.840862586" watchObservedRunningTime="2025-10-01 10:19:26.169584971 +0000 UTC m=+124.862406233" Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.175455 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:26 crc kubenswrapper[4735]: E1001 10:19:26.175667 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:26.675635633 +0000 UTC m=+125.368456895 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.175826 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:26 crc kubenswrapper[4735]: E1001 10:19:26.176182 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:26.676170828 +0000 UTC m=+125.368992140 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.220992 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sngf8" podStartSLOduration=104.220970908 podStartE2EDuration="1m44.220970908s" podCreationTimestamp="2025-10-01 10:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:26.220361361 +0000 UTC m=+124.913182633" watchObservedRunningTime="2025-10-01 10:19:26.220970908 +0000 UTC m=+124.913792170" Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.222804 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-2jcl8" podStartSLOduration=104.222792317 podStartE2EDuration="1m44.222792317s" podCreationTimestamp="2025-10-01 10:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:26.168975034 +0000 UTC m=+124.861796296" watchObservedRunningTime="2025-10-01 10:19:26.222792317 +0000 UTC m=+124.915613579" Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.276896 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:26 crc kubenswrapper[4735]: E1001 10:19:26.277376 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:26.777358348 +0000 UTC m=+125.470179610 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.380235 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:26 crc kubenswrapper[4735]: E1001 10:19:26.380660 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:26.880640305 +0000 UTC m=+125.573461647 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.424523 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-d4p82" podStartSLOduration=104.424484429 podStartE2EDuration="1m44.424484429s" podCreationTimestamp="2025-10-01 10:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:26.423038421 +0000 UTC m=+125.115859693" watchObservedRunningTime="2025-10-01 10:19:26.424484429 +0000 UTC m=+125.117305701" Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.426274 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnlqj" podStartSLOduration=105.426264397 podStartE2EDuration="1m45.426264397s" podCreationTimestamp="2025-10-01 10:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:26.391863336 +0000 UTC m=+125.084684598" watchObservedRunningTime="2025-10-01 10:19:26.426264397 +0000 UTC m=+125.119085669" Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.451720 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fkcd4" podStartSLOduration=104.451696278 podStartE2EDuration="1m44.451696278s" podCreationTimestamp="2025-10-01 10:17:42 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:26.445827881 +0000 UTC m=+125.138649143" watchObservedRunningTime="2025-10-01 10:19:26.451696278 +0000 UTC m=+125.144517550" Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.478841 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t7rvp" podStartSLOduration=104.478815955 podStartE2EDuration="1m44.478815955s" podCreationTimestamp="2025-10-01 10:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:26.477587832 +0000 UTC m=+125.170409094" watchObservedRunningTime="2025-10-01 10:19:26.478815955 +0000 UTC m=+125.171637217" Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.481011 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:26 crc kubenswrapper[4735]: E1001 10:19:26.481357 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:26.981345553 +0000 UTC m=+125.674166815 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.574882 4735 patch_prober.go:28] interesting pod/apiserver-76f77b778f-8dk8c container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 01 10:19:26 crc kubenswrapper[4735]: [+]log ok Oct 01 10:19:26 crc kubenswrapper[4735]: [+]etcd ok Oct 01 10:19:26 crc kubenswrapper[4735]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 01 10:19:26 crc kubenswrapper[4735]: [+]poststarthook/generic-apiserver-start-informers ok Oct 01 10:19:26 crc kubenswrapper[4735]: [+]poststarthook/max-in-flight-filter ok Oct 01 10:19:26 crc kubenswrapper[4735]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 01 10:19:26 crc kubenswrapper[4735]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 01 10:19:26 crc kubenswrapper[4735]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 01 10:19:26 crc kubenswrapper[4735]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Oct 01 10:19:26 crc kubenswrapper[4735]: [+]poststarthook/project.openshift.io-projectcache ok Oct 01 10:19:26 crc kubenswrapper[4735]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 01 10:19:26 crc kubenswrapper[4735]: [-]poststarthook/openshift.io-startinformers failed: reason withheld Oct 01 10:19:26 crc kubenswrapper[4735]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 01 10:19:26 
crc kubenswrapper[4735]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 01 10:19:26 crc kubenswrapper[4735]: livez check failed Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.575212 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" podUID="8b62c04d-523d-4858-bddf-5d2162392962" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.582937 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:26 crc kubenswrapper[4735]: E1001 10:19:26.583234 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:27.083221071 +0000 UTC m=+125.776042333 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.614427 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fxcpc" Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.622765 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.623353 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.625509 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.625737 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.635896 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.696705 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.697194 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/549e126b-96a3-402b-834f-334631bc3ab7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"549e126b-96a3-402b-834f-334631bc3ab7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.697310 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/549e126b-96a3-402b-834f-334631bc3ab7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"549e126b-96a3-402b-834f-334631bc3ab7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 10:19:26 crc kubenswrapper[4735]: E1001 10:19:26.697699 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:27.197668857 +0000 UTC m=+125.890490119 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.801932 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/549e126b-96a3-402b-834f-334631bc3ab7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"549e126b-96a3-402b-834f-334631bc3ab7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.802162 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/549e126b-96a3-402b-834f-334631bc3ab7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"549e126b-96a3-402b-834f-334631bc3ab7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.802317 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:26 crc kubenswrapper[4735]: E1001 10:19:26.802711 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-01 10:19:27.30270019 +0000 UTC m=+125.995521452 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.802920 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/549e126b-96a3-402b-834f-334631bc3ab7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"549e126b-96a3-402b-834f-334631bc3ab7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.835568 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/549e126b-96a3-402b-834f-334631bc3ab7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"549e126b-96a3-402b-834f-334631bc3ab7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.906328 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:26 crc kubenswrapper[4735]: E1001 10:19:26.907005 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:27.406986804 +0000 UTC m=+126.099808066 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.909934 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-l67sz" event={"ID":"9999d995-4881-4349-ba17-89eef6722d16","Type":"ContainerStarted","Data":"e503add64ef0cb90d0d05c4f818acd16811824bf4f9137b281636d8e89414ea0"} Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.909978 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-l67sz" event={"ID":"9999d995-4881-4349-ba17-89eef6722d16","Type":"ContainerStarted","Data":"7edc06e8092ad3b3e1a331b534949881e19ca5f4cbc1fc88b89486a1f6b27c37"} Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.911314 4735 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-cqpz2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.911352 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-cqpz2" podUID="b79ac2f9-c8bc-4893-ace2-ca598d77ff52" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: 
connect: connection refused" Oct 01 10:19:26 crc kubenswrapper[4735]: I1001 10:19:26.935756 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.009700 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:27 crc kubenswrapper[4735]: E1001 10:19:27.014521 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:27.514489563 +0000 UTC m=+126.207310825 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.056466 4735 patch_prober.go:28] interesting pod/router-default-5444994796-qc826 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 10:19:27 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 01 10:19:27 crc kubenswrapper[4735]: [+]process-running ok Oct 01 10:19:27 crc kubenswrapper[4735]: healthz check failed Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.056541 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qc826" podUID="79bfc45c-3ae9-416b-9d4b-f56fab2387de" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.110783 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:27 crc kubenswrapper[4735]: E1001 10:19:27.110993 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-01 10:19:27.610962108 +0000 UTC m=+126.303783370 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.111118 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:27 crc kubenswrapper[4735]: E1001 10:19:27.111408 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:27.61139541 +0000 UTC m=+126.304216672 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.212671 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:27 crc kubenswrapper[4735]: E1001 10:19:27.213312 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:27.713296469 +0000 UTC m=+126.406117731 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.314247 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:27 crc kubenswrapper[4735]: E1001 10:19:27.314572 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:27.814560422 +0000 UTC m=+126.507381684 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.328627 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 01 10:19:27 crc kubenswrapper[4735]: W1001 10:19:27.345419 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod549e126b_96a3_402b_834f_334631bc3ab7.slice/crio-0f48194a9e03eb4edddc3d2ff8e2ae0348a76141471717df1024049389be5b33 WatchSource:0}: Error finding container 0f48194a9e03eb4edddc3d2ff8e2ae0348a76141471717df1024049389be5b33: Status 404 returned error can't find the container with id 0f48194a9e03eb4edddc3d2ff8e2ae0348a76141471717df1024049389be5b33 Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.400279 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x7j8p"] Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.401240 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x7j8p" Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.402311 4735 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.411898 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.414888 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:27 crc kubenswrapper[4735]: E1001 10:19:27.414983 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:27.914967031 +0000 UTC m=+126.607788293 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.415247 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:27 crc kubenswrapper[4735]: E1001 10:19:27.415517 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:27.915510246 +0000 UTC m=+126.608331508 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.431077 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x7j8p"] Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.517399 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:27 crc kubenswrapper[4735]: E1001 10:19:27.517574 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:28.017548799 +0000 UTC m=+126.710370061 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.517661 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj856\" (UniqueName: \"kubernetes.io/projected/66aa965b-9b48-44f3-9d53-e1b0ff1829dd-kube-api-access-dj856\") pod \"certified-operators-x7j8p\" (UID: \"66aa965b-9b48-44f3-9d53-e1b0ff1829dd\") " pod="openshift-marketplace/certified-operators-x7j8p" Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.517778 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66aa965b-9b48-44f3-9d53-e1b0ff1829dd-utilities\") pod \"certified-operators-x7j8p\" (UID: \"66aa965b-9b48-44f3-9d53-e1b0ff1829dd\") " pod="openshift-marketplace/certified-operators-x7j8p" Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.517822 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.517903 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/66aa965b-9b48-44f3-9d53-e1b0ff1829dd-catalog-content\") pod \"certified-operators-x7j8p\" (UID: \"66aa965b-9b48-44f3-9d53-e1b0ff1829dd\") " pod="openshift-marketplace/certified-operators-x7j8p" Oct 01 10:19:27 crc kubenswrapper[4735]: E1001 10:19:27.518133 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:28.018122094 +0000 UTC m=+126.710943356 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.529099 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dnlqj" Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.619384 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:27 crc kubenswrapper[4735]: E1001 10:19:27.619810 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-01 10:19:28.119673525 +0000 UTC m=+126.812494787 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.619912 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66aa965b-9b48-44f3-9d53-e1b0ff1829dd-catalog-content\") pod \"certified-operators-x7j8p\" (UID: \"66aa965b-9b48-44f3-9d53-e1b0ff1829dd\") " pod="openshift-marketplace/certified-operators-x7j8p" Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.620328 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66aa965b-9b48-44f3-9d53-e1b0ff1829dd-catalog-content\") pod \"certified-operators-x7j8p\" (UID: \"66aa965b-9b48-44f3-9d53-e1b0ff1829dd\") " pod="openshift-marketplace/certified-operators-x7j8p" Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.620670 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj856\" (UniqueName: \"kubernetes.io/projected/66aa965b-9b48-44f3-9d53-e1b0ff1829dd-kube-api-access-dj856\") pod \"certified-operators-x7j8p\" (UID: \"66aa965b-9b48-44f3-9d53-e1b0ff1829dd\") " pod="openshift-marketplace/certified-operators-x7j8p" Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.620770 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/66aa965b-9b48-44f3-9d53-e1b0ff1829dd-utilities\") pod \"certified-operators-x7j8p\" (UID: \"66aa965b-9b48-44f3-9d53-e1b0ff1829dd\") " pod="openshift-marketplace/certified-operators-x7j8p" Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.621021 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66aa965b-9b48-44f3-9d53-e1b0ff1829dd-utilities\") pod \"certified-operators-x7j8p\" (UID: \"66aa965b-9b48-44f3-9d53-e1b0ff1829dd\") " pod="openshift-marketplace/certified-operators-x7j8p" Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.621062 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:27 crc kubenswrapper[4735]: E1001 10:19:27.621309 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:28.121299678 +0000 UTC m=+126.814120940 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.630603 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h9dbw"] Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.631766 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h9dbw" Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.637887 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.652178 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h9dbw"] Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.673094 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj856\" (UniqueName: \"kubernetes.io/projected/66aa965b-9b48-44f3-9d53-e1b0ff1829dd-kube-api-access-dj856\") pod \"certified-operators-x7j8p\" (UID: \"66aa965b-9b48-44f3-9d53-e1b0ff1829dd\") " pod="openshift-marketplace/certified-operators-x7j8p" Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.720092 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x7j8p" Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.722179 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:27 crc kubenswrapper[4735]: E1001 10:19:27.722284 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:28.222267803 +0000 UTC m=+126.915089065 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.722419 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac706b67-93f2-4ccd-a0e8-7ebb309bc905-utilities\") pod \"community-operators-h9dbw\" (UID: \"ac706b67-93f2-4ccd-a0e8-7ebb309bc905\") " pod="openshift-marketplace/community-operators-h9dbw" Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.722464 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.722509 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75zht\" (UniqueName: \"kubernetes.io/projected/ac706b67-93f2-4ccd-a0e8-7ebb309bc905-kube-api-access-75zht\") pod \"community-operators-h9dbw\" (UID: \"ac706b67-93f2-4ccd-a0e8-7ebb309bc905\") " pod="openshift-marketplace/community-operators-h9dbw" Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.722586 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac706b67-93f2-4ccd-a0e8-7ebb309bc905-catalog-content\") pod \"community-operators-h9dbw\" (UID: \"ac706b67-93f2-4ccd-a0e8-7ebb309bc905\") " pod="openshift-marketplace/community-operators-h9dbw" Oct 01 10:19:27 crc kubenswrapper[4735]: E1001 10:19:27.722712 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:28.222703044 +0000 UTC m=+126.915524306 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.790713 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kq8q9"] Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.791739 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kq8q9" Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.802584 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kq8q9"] Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.824368 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:27 crc kubenswrapper[4735]: E1001 10:19:27.824680 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:28.324648705 +0000 UTC m=+127.017469977 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.825019 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac706b67-93f2-4ccd-a0e8-7ebb309bc905-catalog-content\") pod \"community-operators-h9dbw\" (UID: \"ac706b67-93f2-4ccd-a0e8-7ebb309bc905\") " pod="openshift-marketplace/community-operators-h9dbw" Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.825098 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac706b67-93f2-4ccd-a0e8-7ebb309bc905-utilities\") pod \"community-operators-h9dbw\" (UID: \"ac706b67-93f2-4ccd-a0e8-7ebb309bc905\") " pod="openshift-marketplace/community-operators-h9dbw" Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.825145 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.825172 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75zht\" (UniqueName: \"kubernetes.io/projected/ac706b67-93f2-4ccd-a0e8-7ebb309bc905-kube-api-access-75zht\") pod \"community-operators-h9dbw\" (UID: 
\"ac706b67-93f2-4ccd-a0e8-7ebb309bc905\") " pod="openshift-marketplace/community-operators-h9dbw" Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.826190 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac706b67-93f2-4ccd-a0e8-7ebb309bc905-catalog-content\") pod \"community-operators-h9dbw\" (UID: \"ac706b67-93f2-4ccd-a0e8-7ebb309bc905\") " pod="openshift-marketplace/community-operators-h9dbw" Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.826410 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac706b67-93f2-4ccd-a0e8-7ebb309bc905-utilities\") pod \"community-operators-h9dbw\" (UID: \"ac706b67-93f2-4ccd-a0e8-7ebb309bc905\") " pod="openshift-marketplace/community-operators-h9dbw" Oct 01 10:19:27 crc kubenswrapper[4735]: E1001 10:19:27.826476 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:28.326459274 +0000 UTC m=+127.019280536 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.855327 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75zht\" (UniqueName: \"kubernetes.io/projected/ac706b67-93f2-4ccd-a0e8-7ebb309bc905-kube-api-access-75zht\") pod \"community-operators-h9dbw\" (UID: \"ac706b67-93f2-4ccd-a0e8-7ebb309bc905\") " pod="openshift-marketplace/community-operators-h9dbw" Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.917169 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-l67sz" event={"ID":"9999d995-4881-4349-ba17-89eef6722d16","Type":"ContainerStarted","Data":"dd67c02735dac1983d33b3c3d6b9b4afd9db56d8ea5cf144f615d69052792743"} Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.920817 4735 generic.go:334] "Generic (PLEG): container finished" podID="fa5136cb-51cf-4bc5-8839-d3cbf64e4c39" containerID="87713f8cabf71727c4602948dc62159caee081a69a26b407259c0af4fadd7e48" exitCode=0 Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.920907 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321895-x8dfs" event={"ID":"fa5136cb-51cf-4bc5-8839-d3cbf64e4c39","Type":"ContainerDied","Data":"87713f8cabf71727c4602948dc62159caee081a69a26b407259c0af4fadd7e48"} Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.924481 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"549e126b-96a3-402b-834f-334631bc3ab7","Type":"ContainerStarted","Data":"72ad81c462fb55609609958e2a4191cbe1cfaed97aa4ea983f0143e2aef87d4d"} Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.924665 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"549e126b-96a3-402b-834f-334631bc3ab7","Type":"ContainerStarted","Data":"0f48194a9e03eb4edddc3d2ff8e2ae0348a76141471717df1024049389be5b33"} Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.927887 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:27 crc kubenswrapper[4735]: E1001 10:19:27.927976 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:28.427958213 +0000 UTC m=+127.120779475 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.928385 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3adb5d44-1cc9-4955-8ca4-a27b4e561251-catalog-content\") pod \"certified-operators-kq8q9\" (UID: \"3adb5d44-1cc9-4955-8ca4-a27b4e561251\") " pod="openshift-marketplace/certified-operators-kq8q9" Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.928595 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq245\" (UniqueName: \"kubernetes.io/projected/3adb5d44-1cc9-4955-8ca4-a27b4e561251-kube-api-access-lq245\") pod \"certified-operators-kq8q9\" (UID: \"3adb5d44-1cc9-4955-8ca4-a27b4e561251\") " pod="openshift-marketplace/certified-operators-kq8q9" Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.928630 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3adb5d44-1cc9-4955-8ca4-a27b4e561251-utilities\") pod \"certified-operators-kq8q9\" (UID: \"3adb5d44-1cc9-4955-8ca4-a27b4e561251\") " pod="openshift-marketplace/certified-operators-kq8q9" Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.928694 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:27 crc kubenswrapper[4735]: E1001 10:19:27.929118 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 10:19:28.429102404 +0000 UTC m=+127.121923736 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-klpcq" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.941735 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-l67sz" podStartSLOduration=9.941710841999999 podStartE2EDuration="9.941710842s" podCreationTimestamp="2025-10-01 10:19:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:27.938124425 +0000 UTC m=+126.630945687" watchObservedRunningTime="2025-10-01 10:19:27.941710842 +0000 UTC m=+126.634532104" Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.948163 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h9dbw" Oct 01 10:19:27 crc kubenswrapper[4735]: I1001 10:19:27.959857 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.959788015 podStartE2EDuration="1.959788015s" podCreationTimestamp="2025-10-01 10:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:27.956359953 +0000 UTC m=+126.649181215" watchObservedRunningTime="2025-10-01 10:19:27.959788015 +0000 UTC m=+126.652609287" Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.003066 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d9sm9"] Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.004392 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d9sm9" Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.026757 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x7j8p"] Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.029472 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.029896 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq245\" (UniqueName: \"kubernetes.io/projected/3adb5d44-1cc9-4955-8ca4-a27b4e561251-kube-api-access-lq245\") pod \"certified-operators-kq8q9\" (UID: \"3adb5d44-1cc9-4955-8ca4-a27b4e561251\") " 
pod="openshift-marketplace/certified-operators-kq8q9" Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.029923 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3adb5d44-1cc9-4955-8ca4-a27b4e561251-utilities\") pod \"certified-operators-kq8q9\" (UID: \"3adb5d44-1cc9-4955-8ca4-a27b4e561251\") " pod="openshift-marketplace/certified-operators-kq8q9" Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.030133 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3adb5d44-1cc9-4955-8ca4-a27b4e561251-catalog-content\") pod \"certified-operators-kq8q9\" (UID: \"3adb5d44-1cc9-4955-8ca4-a27b4e561251\") " pod="openshift-marketplace/certified-operators-kq8q9" Oct 01 10:19:28 crc kubenswrapper[4735]: E1001 10:19:28.030838 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 10:19:28.530822548 +0000 UTC m=+127.223643810 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.033945 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d9sm9"] Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.036451 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3adb5d44-1cc9-4955-8ca4-a27b4e561251-utilities\") pod \"certified-operators-kq8q9\" (UID: \"3adb5d44-1cc9-4955-8ca4-a27b4e561251\") " pod="openshift-marketplace/certified-operators-kq8q9" Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.037209 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3adb5d44-1cc9-4955-8ca4-a27b4e561251-catalog-content\") pod \"certified-operators-kq8q9\" (UID: \"3adb5d44-1cc9-4955-8ca4-a27b4e561251\") " pod="openshift-marketplace/certified-operators-kq8q9" Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.051300 4735 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-01T10:19:27.402332572Z","Handler":null,"Name":""} Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.053566 4735 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 
10:19:28.053591 4735 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.061273 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq245\" (UniqueName: \"kubernetes.io/projected/3adb5d44-1cc9-4955-8ca4-a27b4e561251-kube-api-access-lq245\") pod \"certified-operators-kq8q9\" (UID: \"3adb5d44-1cc9-4955-8ca4-a27b4e561251\") " pod="openshift-marketplace/certified-operators-kq8q9" Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.062059 4735 patch_prober.go:28] interesting pod/router-default-5444994796-qc826 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 10:19:28 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 01 10:19:28 crc kubenswrapper[4735]: [+]process-running ok Oct 01 10:19:28 crc kubenswrapper[4735]: healthz check failed Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.062115 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qc826" podUID="79bfc45c-3ae9-416b-9d4b-f56fab2387de" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.109916 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kq8q9" Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.131423 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5608a54e-a42f-4e14-aa3e-a5503c7a4dc4-utilities\") pod \"community-operators-d9sm9\" (UID: \"5608a54e-a42f-4e14-aa3e-a5503c7a4dc4\") " pod="openshift-marketplace/community-operators-d9sm9" Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.131845 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.131907 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqb7k\" (UniqueName: \"kubernetes.io/projected/5608a54e-a42f-4e14-aa3e-a5503c7a4dc4-kube-api-access-mqb7k\") pod \"community-operators-d9sm9\" (UID: \"5608a54e-a42f-4e14-aa3e-a5503c7a4dc4\") " pod="openshift-marketplace/community-operators-d9sm9" Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.131942 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5608a54e-a42f-4e14-aa3e-a5503c7a4dc4-catalog-content\") pod \"community-operators-d9sm9\" (UID: \"5608a54e-a42f-4e14-aa3e-a5503c7a4dc4\") " pod="openshift-marketplace/community-operators-d9sm9" Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.136198 4735 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.136239 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.190929 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-klpcq\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.234740 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.235801 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqb7k\" (UniqueName: \"kubernetes.io/projected/5608a54e-a42f-4e14-aa3e-a5503c7a4dc4-kube-api-access-mqb7k\") pod \"community-operators-d9sm9\" (UID: \"5608a54e-a42f-4e14-aa3e-a5503c7a4dc4\") " pod="openshift-marketplace/community-operators-d9sm9" Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.235849 4735 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5608a54e-a42f-4e14-aa3e-a5503c7a4dc4-catalog-content\") pod \"community-operators-d9sm9\" (UID: \"5608a54e-a42f-4e14-aa3e-a5503c7a4dc4\") " pod="openshift-marketplace/community-operators-d9sm9" Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.235939 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5608a54e-a42f-4e14-aa3e-a5503c7a4dc4-utilities\") pod \"community-operators-d9sm9\" (UID: \"5608a54e-a42f-4e14-aa3e-a5503c7a4dc4\") " pod="openshift-marketplace/community-operators-d9sm9" Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.236849 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5608a54e-a42f-4e14-aa3e-a5503c7a4dc4-utilities\") pod \"community-operators-d9sm9\" (UID: \"5608a54e-a42f-4e14-aa3e-a5503c7a4dc4\") " pod="openshift-marketplace/community-operators-d9sm9" Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.237084 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5608a54e-a42f-4e14-aa3e-a5503c7a4dc4-catalog-content\") pod \"community-operators-d9sm9\" (UID: \"5608a54e-a42f-4e14-aa3e-a5503c7a4dc4\") " pod="openshift-marketplace/community-operators-d9sm9" Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.247244 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.261322 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqb7k\" (UniqueName: \"kubernetes.io/projected/5608a54e-a42f-4e14-aa3e-a5503c7a4dc4-kube-api-access-mqb7k\") pod \"community-operators-d9sm9\" (UID: \"5608a54e-a42f-4e14-aa3e-a5503c7a4dc4\") " pod="openshift-marketplace/community-operators-d9sm9" Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.269107 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.277765 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.302976 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h9dbw"] Oct 01 10:19:28 crc kubenswrapper[4735]: W1001 10:19:28.339844 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac706b67_93f2_4ccd_a0e8_7ebb309bc905.slice/crio-6fdcc09b20be807d3f66e215ae8d9ce3c47b399fe46a570c77c80da146b757c9 WatchSource:0}: Error finding container 6fdcc09b20be807d3f66e215ae8d9ce3c47b399fe46a570c77c80da146b757c9: Status 404 returned error can't find the container with id 6fdcc09b20be807d3f66e215ae8d9ce3c47b399fe46a570c77c80da146b757c9 Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.343732 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d9sm9" Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.369717 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kq8q9"] Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.579110 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d9sm9"] Oct 01 10:19:28 crc kubenswrapper[4735]: W1001 10:19:28.613982 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5608a54e_a42f_4e14_aa3e_a5503c7a4dc4.slice/crio-0d3dfd3fa57498da0ddf3bb0f95a9a76b7cbfac8e3d3b37076bad2e3afebbac8 WatchSource:0}: Error finding container 0d3dfd3fa57498da0ddf3bb0f95a9a76b7cbfac8e3d3b37076bad2e3afebbac8: Status 404 returned error can't find the container with id 0d3dfd3fa57498da0ddf3bb0f95a9a76b7cbfac8e3d3b37076bad2e3afebbac8 Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.721289 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-klpcq"] Oct 01 10:19:28 crc kubenswrapper[4735]: W1001 10:19:28.736311 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19fd9940_52eb_4a65_8e75_531a27563c1b.slice/crio-c757aab0c46ee644eb24762a7cdc62f339e491b5a51f297d409280012202302f WatchSource:0}: Error finding container c757aab0c46ee644eb24762a7cdc62f339e491b5a51f297d409280012202302f: Status 404 returned error can't find the container with id c757aab0c46ee644eb24762a7cdc62f339e491b5a51f297d409280012202302f Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.931156 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" 
event={"ID":"19fd9940-52eb-4a65-8e75-531a27563c1b","Type":"ContainerStarted","Data":"f26265030764482850917e667e3c11a0c514d98b6f27f50c7d7562ee62019e55"} Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.931208 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" event={"ID":"19fd9940-52eb-4a65-8e75-531a27563c1b","Type":"ContainerStarted","Data":"c757aab0c46ee644eb24762a7cdc62f339e491b5a51f297d409280012202302f"} Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.931264 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.932795 4735 generic.go:334] "Generic (PLEG): container finished" podID="549e126b-96a3-402b-834f-334631bc3ab7" containerID="72ad81c462fb55609609958e2a4191cbe1cfaed97aa4ea983f0143e2aef87d4d" exitCode=0 Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.932854 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"549e126b-96a3-402b-834f-334631bc3ab7","Type":"ContainerDied","Data":"72ad81c462fb55609609958e2a4191cbe1cfaed97aa4ea983f0143e2aef87d4d"} Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.934567 4735 generic.go:334] "Generic (PLEG): container finished" podID="ac706b67-93f2-4ccd-a0e8-7ebb309bc905" containerID="c37d4aff5240b0d74e8f163e0ba7a656bca8fe42d2da0905799c2053c43e3684" exitCode=0 Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.934642 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h9dbw" event={"ID":"ac706b67-93f2-4ccd-a0e8-7ebb309bc905","Type":"ContainerDied","Data":"c37d4aff5240b0d74e8f163e0ba7a656bca8fe42d2da0905799c2053c43e3684"} Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.934667 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-h9dbw" event={"ID":"ac706b67-93f2-4ccd-a0e8-7ebb309bc905","Type":"ContainerStarted","Data":"6fdcc09b20be807d3f66e215ae8d9ce3c47b399fe46a570c77c80da146b757c9"} Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.936043 4735 generic.go:334] "Generic (PLEG): container finished" podID="3adb5d44-1cc9-4955-8ca4-a27b4e561251" containerID="dfcfd4ccf161be8a6da28520192e7c12b9c8bdf77dc7f95e830dc32b0ef52ce3" exitCode=0 Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.936073 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kq8q9" event={"ID":"3adb5d44-1cc9-4955-8ca4-a27b4e561251","Type":"ContainerDied","Data":"dfcfd4ccf161be8a6da28520192e7c12b9c8bdf77dc7f95e830dc32b0ef52ce3"} Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.936252 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kq8q9" event={"ID":"3adb5d44-1cc9-4955-8ca4-a27b4e561251","Type":"ContainerStarted","Data":"0ce88546cea0ce1839f623c0f211aaacb56d602300b0dd7929991ab561a8edd3"} Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.936534 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.937556 4735 generic.go:334] "Generic (PLEG): container finished" podID="5608a54e-a42f-4e14-aa3e-a5503c7a4dc4" containerID="3c4f44fa70bb4e00f8f21d1221983535a1fad684324a8fa59ff117354160c913" exitCode=0 Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.937654 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d9sm9" event={"ID":"5608a54e-a42f-4e14-aa3e-a5503c7a4dc4","Type":"ContainerDied","Data":"3c4f44fa70bb4e00f8f21d1221983535a1fad684324a8fa59ff117354160c913"} Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.937923 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-d9sm9" event={"ID":"5608a54e-a42f-4e14-aa3e-a5503c7a4dc4","Type":"ContainerStarted","Data":"0d3dfd3fa57498da0ddf3bb0f95a9a76b7cbfac8e3d3b37076bad2e3afebbac8"} Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.940657 4735 generic.go:334] "Generic (PLEG): container finished" podID="66aa965b-9b48-44f3-9d53-e1b0ff1829dd" containerID="c5cf2f19bb625e66680192288845ec7e9855743f7955817ea5205ea8c9f03f59" exitCode=0 Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.940757 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x7j8p" event={"ID":"66aa965b-9b48-44f3-9d53-e1b0ff1829dd","Type":"ContainerDied","Data":"c5cf2f19bb625e66680192288845ec7e9855743f7955817ea5205ea8c9f03f59"} Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.940781 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x7j8p" event={"ID":"66aa965b-9b48-44f3-9d53-e1b0ff1829dd","Type":"ContainerStarted","Data":"f7314ffd4ffc1aa1db6927c7916ce18d03387ed815fba716dc042108832c29bf"} Oct 01 10:19:28 crc kubenswrapper[4735]: I1001 10:19:28.957618 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" podStartSLOduration=106.957592483 podStartE2EDuration="1m46.957592483s" podCreationTimestamp="2025-10-01 10:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:28.951043718 +0000 UTC m=+127.643864990" watchObservedRunningTime="2025-10-01 10:19:28.957592483 +0000 UTC m=+127.650413735" Oct 01 10:19:29 crc kubenswrapper[4735]: I1001 10:19:29.116485 4735 patch_prober.go:28] interesting pod/router-default-5444994796-qc826 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: 
reason withheld Oct 01 10:19:29 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 01 10:19:29 crc kubenswrapper[4735]: [+]process-running ok Oct 01 10:19:29 crc kubenswrapper[4735]: healthz check failed Oct 01 10:19:29 crc kubenswrapper[4735]: I1001 10:19:29.116577 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qc826" podUID="79bfc45c-3ae9-416b-9d4b-f56fab2387de" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 10:19:29 crc kubenswrapper[4735]: I1001 10:19:29.166390 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321895-x8dfs" Oct 01 10:19:29 crc kubenswrapper[4735]: I1001 10:19:29.251430 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa5136cb-51cf-4bc5-8839-d3cbf64e4c39-config-volume\") pod \"fa5136cb-51cf-4bc5-8839-d3cbf64e4c39\" (UID: \"fa5136cb-51cf-4bc5-8839-d3cbf64e4c39\") " Oct 01 10:19:29 crc kubenswrapper[4735]: I1001 10:19:29.251528 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f22tm\" (UniqueName: \"kubernetes.io/projected/fa5136cb-51cf-4bc5-8839-d3cbf64e4c39-kube-api-access-f22tm\") pod \"fa5136cb-51cf-4bc5-8839-d3cbf64e4c39\" (UID: \"fa5136cb-51cf-4bc5-8839-d3cbf64e4c39\") " Oct 01 10:19:29 crc kubenswrapper[4735]: I1001 10:19:29.251573 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa5136cb-51cf-4bc5-8839-d3cbf64e4c39-secret-volume\") pod \"fa5136cb-51cf-4bc5-8839-d3cbf64e4c39\" (UID: \"fa5136cb-51cf-4bc5-8839-d3cbf64e4c39\") " Oct 01 10:19:29 crc kubenswrapper[4735]: I1001 10:19:29.252143 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/fa5136cb-51cf-4bc5-8839-d3cbf64e4c39-config-volume" (OuterVolumeSpecName: "config-volume") pod "fa5136cb-51cf-4bc5-8839-d3cbf64e4c39" (UID: "fa5136cb-51cf-4bc5-8839-d3cbf64e4c39"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:19:29 crc kubenswrapper[4735]: I1001 10:19:29.256186 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa5136cb-51cf-4bc5-8839-d3cbf64e4c39-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fa5136cb-51cf-4bc5-8839-d3cbf64e4c39" (UID: "fa5136cb-51cf-4bc5-8839-d3cbf64e4c39"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:19:29 crc kubenswrapper[4735]: I1001 10:19:29.256232 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa5136cb-51cf-4bc5-8839-d3cbf64e4c39-kube-api-access-f22tm" (OuterVolumeSpecName: "kube-api-access-f22tm") pod "fa5136cb-51cf-4bc5-8839-d3cbf64e4c39" (UID: "fa5136cb-51cf-4bc5-8839-d3cbf64e4c39"). InnerVolumeSpecName "kube-api-access-f22tm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:19:29 crc kubenswrapper[4735]: I1001 10:19:29.353524 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa5136cb-51cf-4bc5-8839-d3cbf64e4c39-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 10:19:29 crc kubenswrapper[4735]: I1001 10:19:29.353631 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa5136cb-51cf-4bc5-8839-d3cbf64e4c39-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 10:19:29 crc kubenswrapper[4735]: I1001 10:19:29.353642 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f22tm\" (UniqueName: \"kubernetes.io/projected/fa5136cb-51cf-4bc5-8839-d3cbf64e4c39-kube-api-access-f22tm\") on node \"crc\" DevicePath \"\"" Oct 01 10:19:29 crc kubenswrapper[4735]: I1001 10:19:29.591370 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hx799"] Oct 01 10:19:29 crc kubenswrapper[4735]: E1001 10:19:29.591620 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa5136cb-51cf-4bc5-8839-d3cbf64e4c39" containerName="collect-profiles" Oct 01 10:19:29 crc kubenswrapper[4735]: I1001 10:19:29.591638 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa5136cb-51cf-4bc5-8839-d3cbf64e4c39" containerName="collect-profiles" Oct 01 10:19:29 crc kubenswrapper[4735]: I1001 10:19:29.591772 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa5136cb-51cf-4bc5-8839-d3cbf64e4c39" containerName="collect-profiles" Oct 01 10:19:29 crc kubenswrapper[4735]: I1001 10:19:29.592532 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hx799" Oct 01 10:19:29 crc kubenswrapper[4735]: I1001 10:19:29.595530 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 01 10:19:29 crc kubenswrapper[4735]: I1001 10:19:29.607611 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hx799"] Oct 01 10:19:29 crc kubenswrapper[4735]: I1001 10:19:29.656667 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2ae6c32-20b1-4c6b-872d-8c72c5a28af6-catalog-content\") pod \"redhat-marketplace-hx799\" (UID: \"b2ae6c32-20b1-4c6b-872d-8c72c5a28af6\") " pod="openshift-marketplace/redhat-marketplace-hx799" Oct 01 10:19:29 crc kubenswrapper[4735]: I1001 10:19:29.656722 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2ae6c32-20b1-4c6b-872d-8c72c5a28af6-utilities\") pod \"redhat-marketplace-hx799\" (UID: \"b2ae6c32-20b1-4c6b-872d-8c72c5a28af6\") " pod="openshift-marketplace/redhat-marketplace-hx799" Oct 01 10:19:29 crc kubenswrapper[4735]: I1001 10:19:29.656777 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls2pr\" (UniqueName: \"kubernetes.io/projected/b2ae6c32-20b1-4c6b-872d-8c72c5a28af6-kube-api-access-ls2pr\") pod \"redhat-marketplace-hx799\" (UID: \"b2ae6c32-20b1-4c6b-872d-8c72c5a28af6\") " pod="openshift-marketplace/redhat-marketplace-hx799" Oct 01 10:19:29 crc kubenswrapper[4735]: I1001 10:19:29.758303 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2ae6c32-20b1-4c6b-872d-8c72c5a28af6-catalog-content\") pod \"redhat-marketplace-hx799\" (UID: 
\"b2ae6c32-20b1-4c6b-872d-8c72c5a28af6\") " pod="openshift-marketplace/redhat-marketplace-hx799" Oct 01 10:19:29 crc kubenswrapper[4735]: I1001 10:19:29.758970 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2ae6c32-20b1-4c6b-872d-8c72c5a28af6-utilities\") pod \"redhat-marketplace-hx799\" (UID: \"b2ae6c32-20b1-4c6b-872d-8c72c5a28af6\") " pod="openshift-marketplace/redhat-marketplace-hx799" Oct 01 10:19:29 crc kubenswrapper[4735]: I1001 10:19:29.759004 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls2pr\" (UniqueName: \"kubernetes.io/projected/b2ae6c32-20b1-4c6b-872d-8c72c5a28af6-kube-api-access-ls2pr\") pod \"redhat-marketplace-hx799\" (UID: \"b2ae6c32-20b1-4c6b-872d-8c72c5a28af6\") " pod="openshift-marketplace/redhat-marketplace-hx799" Oct 01 10:19:29 crc kubenswrapper[4735]: I1001 10:19:29.759048 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2ae6c32-20b1-4c6b-872d-8c72c5a28af6-catalog-content\") pod \"redhat-marketplace-hx799\" (UID: \"b2ae6c32-20b1-4c6b-872d-8c72c5a28af6\") " pod="openshift-marketplace/redhat-marketplace-hx799" Oct 01 10:19:29 crc kubenswrapper[4735]: I1001 10:19:29.759524 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2ae6c32-20b1-4c6b-872d-8c72c5a28af6-utilities\") pod \"redhat-marketplace-hx799\" (UID: \"b2ae6c32-20b1-4c6b-872d-8c72c5a28af6\") " pod="openshift-marketplace/redhat-marketplace-hx799" Oct 01 10:19:29 crc kubenswrapper[4735]: I1001 10:19:29.775161 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls2pr\" (UniqueName: \"kubernetes.io/projected/b2ae6c32-20b1-4c6b-872d-8c72c5a28af6-kube-api-access-ls2pr\") pod \"redhat-marketplace-hx799\" (UID: \"b2ae6c32-20b1-4c6b-872d-8c72c5a28af6\") " 
pod="openshift-marketplace/redhat-marketplace-hx799" Oct 01 10:19:29 crc kubenswrapper[4735]: I1001 10:19:29.907066 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 01 10:19:29 crc kubenswrapper[4735]: I1001 10:19:29.910067 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hx799" Oct 01 10:19:29 crc kubenswrapper[4735]: I1001 10:19:29.949752 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321895-x8dfs" event={"ID":"fa5136cb-51cf-4bc5-8839-d3cbf64e4c39","Type":"ContainerDied","Data":"c2bc6bdc2b2d7708ce412532abde428d99bc1b06336686d6c59270c362cea21b"} Oct 01 10:19:29 crc kubenswrapper[4735]: I1001 10:19:29.949809 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321895-x8dfs" Oct 01 10:19:29 crc kubenswrapper[4735]: I1001 10:19:29.949821 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2bc6bdc2b2d7708ce412532abde428d99bc1b06336686d6c59270c362cea21b" Oct 01 10:19:29 crc kubenswrapper[4735]: I1001 10:19:29.994358 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k844p"] Oct 01 10:19:29 crc kubenswrapper[4735]: I1001 10:19:29.995354 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k844p" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.003928 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k844p"] Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.060149 4735 patch_prober.go:28] interesting pod/router-default-5444994796-qc826 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 10:19:30 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 01 10:19:30 crc kubenswrapper[4735]: [+]process-running ok Oct 01 10:19:30 crc kubenswrapper[4735]: healthz check failed Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.060204 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qc826" podUID="79bfc45c-3ae9-416b-9d4b-f56fab2387de" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.068627 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44sbb\" (UniqueName: \"kubernetes.io/projected/0c167fd0-1242-4be7-8261-391836d921ae-kube-api-access-44sbb\") pod \"redhat-marketplace-k844p\" (UID: \"0c167fd0-1242-4be7-8261-391836d921ae\") " pod="openshift-marketplace/redhat-marketplace-k844p" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.068675 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c167fd0-1242-4be7-8261-391836d921ae-utilities\") pod \"redhat-marketplace-k844p\" (UID: \"0c167fd0-1242-4be7-8261-391836d921ae\") " pod="openshift-marketplace/redhat-marketplace-k844p" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.068735 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c167fd0-1242-4be7-8261-391836d921ae-catalog-content\") pod \"redhat-marketplace-k844p\" (UID: \"0c167fd0-1242-4be7-8261-391836d921ae\") " pod="openshift-marketplace/redhat-marketplace-k844p" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.170189 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c167fd0-1242-4be7-8261-391836d921ae-catalog-content\") pod \"redhat-marketplace-k844p\" (UID: \"0c167fd0-1242-4be7-8261-391836d921ae\") " pod="openshift-marketplace/redhat-marketplace-k844p" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.170335 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44sbb\" (UniqueName: \"kubernetes.io/projected/0c167fd0-1242-4be7-8261-391836d921ae-kube-api-access-44sbb\") pod \"redhat-marketplace-k844p\" (UID: \"0c167fd0-1242-4be7-8261-391836d921ae\") " pod="openshift-marketplace/redhat-marketplace-k844p" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.170354 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c167fd0-1242-4be7-8261-391836d921ae-utilities\") pod \"redhat-marketplace-k844p\" (UID: \"0c167fd0-1242-4be7-8261-391836d921ae\") " pod="openshift-marketplace/redhat-marketplace-k844p" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.170883 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c167fd0-1242-4be7-8261-391836d921ae-catalog-content\") pod \"redhat-marketplace-k844p\" (UID: \"0c167fd0-1242-4be7-8261-391836d921ae\") " pod="openshift-marketplace/redhat-marketplace-k844p" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.170950 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c167fd0-1242-4be7-8261-391836d921ae-utilities\") pod \"redhat-marketplace-k844p\" (UID: \"0c167fd0-1242-4be7-8261-391836d921ae\") " pod="openshift-marketplace/redhat-marketplace-k844p" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.201302 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44sbb\" (UniqueName: \"kubernetes.io/projected/0c167fd0-1242-4be7-8261-391836d921ae-kube-api-access-44sbb\") pod \"redhat-marketplace-k844p\" (UID: \"0c167fd0-1242-4be7-8261-391836d921ae\") " pod="openshift-marketplace/redhat-marketplace-k844p" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.208033 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.271324 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/549e126b-96a3-402b-834f-334631bc3ab7-kube-api-access\") pod \"549e126b-96a3-402b-834f-334631bc3ab7\" (UID: \"549e126b-96a3-402b-834f-334631bc3ab7\") " Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.271371 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/549e126b-96a3-402b-834f-334631bc3ab7-kubelet-dir\") pod \"549e126b-96a3-402b-834f-334631bc3ab7\" (UID: \"549e126b-96a3-402b-834f-334631bc3ab7\") " Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.271530 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/549e126b-96a3-402b-834f-334631bc3ab7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "549e126b-96a3-402b-834f-334631bc3ab7" (UID: "549e126b-96a3-402b-834f-334631bc3ab7"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.271745 4735 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/549e126b-96a3-402b-834f-334631bc3ab7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.274984 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/549e126b-96a3-402b-834f-334631bc3ab7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "549e126b-96a3-402b-834f-334631bc3ab7" (UID: "549e126b-96a3-402b-834f-334631bc3ab7"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.340978 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k844p" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.373902 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/549e126b-96a3-402b-834f-334631bc3ab7-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.417402 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hx799"] Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.589245 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.594321 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6nvtz"] Oct 01 10:19:30 crc kubenswrapper[4735]: E1001 10:19:30.594559 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="549e126b-96a3-402b-834f-334631bc3ab7" containerName="pruner" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 
10:19:30.594590 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="549e126b-96a3-402b-834f-334631bc3ab7" containerName="pruner" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.594747 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="549e126b-96a3-402b-834f-334631bc3ab7" containerName="pruner" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.595511 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6nvtz" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.601822 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.606022 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-8dk8c" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.607435 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6nvtz"] Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.693818 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3285eed-d809-4408-af84-a2020d07c59c-catalog-content\") pod \"redhat-operators-6nvtz\" (UID: \"e3285eed-d809-4408-af84-a2020d07c59c\") " pod="openshift-marketplace/redhat-operators-6nvtz" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.693931 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3285eed-d809-4408-af84-a2020d07c59c-utilities\") pod \"redhat-operators-6nvtz\" (UID: \"e3285eed-d809-4408-af84-a2020d07c59c\") " pod="openshift-marketplace/redhat-operators-6nvtz" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.694179 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gckxs\" (UniqueName: \"kubernetes.io/projected/e3285eed-d809-4408-af84-a2020d07c59c-kube-api-access-gckxs\") pod \"redhat-operators-6nvtz\" (UID: \"e3285eed-d809-4408-af84-a2020d07c59c\") " pod="openshift-marketplace/redhat-operators-6nvtz" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.710890 4735 patch_prober.go:28] interesting pod/downloads-7954f5f757-dm5x5 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.710935 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-dm5x5" podUID="c8642ffb-52ec-44db-b2f8-33a1e98b5328" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.712185 4735 patch_prober.go:28] interesting pod/downloads-7954f5f757-dm5x5 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.712236 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dm5x5" podUID="c8642ffb-52ec-44db-b2f8-33a1e98b5328" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.760129 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k844p"] Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.797154 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gckxs\" (UniqueName: \"kubernetes.io/projected/e3285eed-d809-4408-af84-a2020d07c59c-kube-api-access-gckxs\") pod \"redhat-operators-6nvtz\" (UID: \"e3285eed-d809-4408-af84-a2020d07c59c\") " pod="openshift-marketplace/redhat-operators-6nvtz" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.797234 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3285eed-d809-4408-af84-a2020d07c59c-catalog-content\") pod \"redhat-operators-6nvtz\" (UID: \"e3285eed-d809-4408-af84-a2020d07c59c\") " pod="openshift-marketplace/redhat-operators-6nvtz" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.797281 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3285eed-d809-4408-af84-a2020d07c59c-utilities\") pod \"redhat-operators-6nvtz\" (UID: \"e3285eed-d809-4408-af84-a2020d07c59c\") " pod="openshift-marketplace/redhat-operators-6nvtz" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.797727 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3285eed-d809-4408-af84-a2020d07c59c-utilities\") pod \"redhat-operators-6nvtz\" (UID: \"e3285eed-d809-4408-af84-a2020d07c59c\") " pod="openshift-marketplace/redhat-operators-6nvtz" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.798157 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3285eed-d809-4408-af84-a2020d07c59c-catalog-content\") pod \"redhat-operators-6nvtz\" (UID: \"e3285eed-d809-4408-af84-a2020d07c59c\") " pod="openshift-marketplace/redhat-operators-6nvtz" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.814741 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gckxs\" 
(UniqueName: \"kubernetes.io/projected/e3285eed-d809-4408-af84-a2020d07c59c-kube-api-access-gckxs\") pod \"redhat-operators-6nvtz\" (UID: \"e3285eed-d809-4408-af84-a2020d07c59c\") " pod="openshift-marketplace/redhat-operators-6nvtz" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.924108 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6nvtz" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.961617 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k844p" event={"ID":"0c167fd0-1242-4be7-8261-391836d921ae","Type":"ContainerStarted","Data":"32db815162b96c231fcf92135f41c789676f03e56280c42a395a7424dd1378e9"} Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.965400 4735 generic.go:334] "Generic (PLEG): container finished" podID="b2ae6c32-20b1-4c6b-872d-8c72c5a28af6" containerID="adfe18d10a77232c579fe4bfc1506266a5d93c1c31cffc76cfca8c13d6cbf62d" exitCode=0 Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.965471 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hx799" event={"ID":"b2ae6c32-20b1-4c6b-872d-8c72c5a28af6","Type":"ContainerDied","Data":"adfe18d10a77232c579fe4bfc1506266a5d93c1c31cffc76cfca8c13d6cbf62d"} Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.965515 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hx799" event={"ID":"b2ae6c32-20b1-4c6b-872d-8c72c5a28af6","Type":"ContainerStarted","Data":"70cd9772e346201ac41ad6b6c3f6af23c2c03d962ffd0c52616c57af3166745f"} Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.969595 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"549e126b-96a3-402b-834f-334631bc3ab7","Type":"ContainerDied","Data":"0f48194a9e03eb4edddc3d2ff8e2ae0348a76141471717df1024049389be5b33"} Oct 01 10:19:30 crc 
kubenswrapper[4735]: I1001 10:19:30.969640 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f48194a9e03eb4edddc3d2ff8e2ae0348a76141471717df1024049389be5b33" Oct 01 10:19:30 crc kubenswrapper[4735]: I1001 10:19:30.969666 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.000463 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5z55n"] Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.001593 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5z55n" Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.011983 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5z55n"] Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.055131 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-qc826" Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.061074 4735 patch_prober.go:28] interesting pod/router-default-5444994796-qc826 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 10:19:31 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 01 10:19:31 crc kubenswrapper[4735]: [+]process-running ok Oct 01 10:19:31 crc kubenswrapper[4735]: healthz check failed Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.061130 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qc826" podUID="79bfc45c-3ae9-416b-9d4b-f56fab2387de" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 
10:19:31.109608 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7191c19-9c62-4c02-851c-1f871aed06f2-utilities\") pod \"redhat-operators-5z55n\" (UID: \"a7191c19-9c62-4c02-851c-1f871aed06f2\") " pod="openshift-marketplace/redhat-operators-5z55n" Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.109817 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7191c19-9c62-4c02-851c-1f871aed06f2-catalog-content\") pod \"redhat-operators-5z55n\" (UID: \"a7191c19-9c62-4c02-851c-1f871aed06f2\") " pod="openshift-marketplace/redhat-operators-5z55n" Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.109952 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtr5n\" (UniqueName: \"kubernetes.io/projected/a7191c19-9c62-4c02-851c-1f871aed06f2-kube-api-access-jtr5n\") pod \"redhat-operators-5z55n\" (UID: \"a7191c19-9c62-4c02-851c-1f871aed06f2\") " pod="openshift-marketplace/redhat-operators-5z55n" Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.210671 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7191c19-9c62-4c02-851c-1f871aed06f2-utilities\") pod \"redhat-operators-5z55n\" (UID: \"a7191c19-9c62-4c02-851c-1f871aed06f2\") " pod="openshift-marketplace/redhat-operators-5z55n" Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.211062 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7191c19-9c62-4c02-851c-1f871aed06f2-catalog-content\") pod \"redhat-operators-5z55n\" (UID: \"a7191c19-9c62-4c02-851c-1f871aed06f2\") " pod="openshift-marketplace/redhat-operators-5z55n" Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 
10:19:31.211119 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7191c19-9c62-4c02-851c-1f871aed06f2-utilities\") pod \"redhat-operators-5z55n\" (UID: \"a7191c19-9c62-4c02-851c-1f871aed06f2\") " pod="openshift-marketplace/redhat-operators-5z55n" Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.211145 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtr5n\" (UniqueName: \"kubernetes.io/projected/a7191c19-9c62-4c02-851c-1f871aed06f2-kube-api-access-jtr5n\") pod \"redhat-operators-5z55n\" (UID: \"a7191c19-9c62-4c02-851c-1f871aed06f2\") " pod="openshift-marketplace/redhat-operators-5z55n" Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.211393 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7191c19-9c62-4c02-851c-1f871aed06f2-catalog-content\") pod \"redhat-operators-5z55n\" (UID: \"a7191c19-9c62-4c02-851c-1f871aed06f2\") " pod="openshift-marketplace/redhat-operators-5z55n" Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.228456 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtr5n\" (UniqueName: \"kubernetes.io/projected/a7191c19-9c62-4c02-851c-1f871aed06f2-kube-api-access-jtr5n\") pod \"redhat-operators-5z55n\" (UID: \"a7191c19-9c62-4c02-851c-1f871aed06f2\") " pod="openshift-marketplace/redhat-operators-5z55n" Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.319121 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5z55n" Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.391009 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-xtfsg" Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.391081 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-xtfsg" Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.400648 4735 patch_prober.go:28] interesting pod/console-f9d7485db-xtfsg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.37:8443/health\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.400699 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-xtfsg" podUID="5e28238a-9bf2-4e10-827d-7350e0ec0150" containerName="console" probeResult="failure" output="Get \"https://10.217.0.37:8443/health\": dial tcp 10.217.0.37:8443: connect: connection refused" Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.460106 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.460861 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.463857 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.464904 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.465089 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.505070 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6nvtz"] Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.514345 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c4ff8717-b8b7-445a-a152-4c1aab50702d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"c4ff8717-b8b7-445a-a152-4c1aab50702d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.514737 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c4ff8717-b8b7-445a-a152-4c1aab50702d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"c4ff8717-b8b7-445a-a152-4c1aab50702d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.615989 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c4ff8717-b8b7-445a-a152-4c1aab50702d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"c4ff8717-b8b7-445a-a152-4c1aab50702d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 
10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.616052 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c4ff8717-b8b7-445a-a152-4c1aab50702d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"c4ff8717-b8b7-445a-a152-4c1aab50702d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.616134 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c4ff8717-b8b7-445a-a152-4c1aab50702d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"c4ff8717-b8b7-445a-a152-4c1aab50702d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.617197 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5z55n"] Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.632722 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c4ff8717-b8b7-445a-a152-4c1aab50702d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"c4ff8717-b8b7-445a-a152-4c1aab50702d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 10:19:31 crc kubenswrapper[4735]: W1001 10:19:31.640926 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7191c19_9c62_4c02_851c_1f871aed06f2.slice/crio-2b6348e36a0c789c5650b286baeb2be09925098b49d6c3298233b9c3055b316d WatchSource:0}: Error finding container 2b6348e36a0c789c5650b286baeb2be09925098b49d6c3298233b9c3055b316d: Status 404 returned error can't find the container with id 2b6348e36a0c789c5650b286baeb2be09925098b49d6c3298233b9c3055b316d Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.778046 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/marketplace-operator-79b997595-cqpz2" Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.781436 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.986384 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5z55n" event={"ID":"a7191c19-9c62-4c02-851c-1f871aed06f2","Type":"ContainerStarted","Data":"2b6348e36a0c789c5650b286baeb2be09925098b49d6c3298233b9c3055b316d"} Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.993197 4735 generic.go:334] "Generic (PLEG): container finished" podID="0c167fd0-1242-4be7-8261-391836d921ae" containerID="829a34e3557f2f5a309628835ac9571058879b0f2dcfa81d77d4634fc42b3098" exitCode=0 Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.993267 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k844p" event={"ID":"0c167fd0-1242-4be7-8261-391836d921ae","Type":"ContainerDied","Data":"829a34e3557f2f5a309628835ac9571058879b0f2dcfa81d77d4634fc42b3098"} Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.997856 4735 generic.go:334] "Generic (PLEG): container finished" podID="e3285eed-d809-4408-af84-a2020d07c59c" containerID="edf6db42a739636c3e4a36b15663ff296a3c322cd059ad67f23c58bc8f0cc4c6" exitCode=0 Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.998465 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6nvtz" event={"ID":"e3285eed-d809-4408-af84-a2020d07c59c","Type":"ContainerDied","Data":"edf6db42a739636c3e4a36b15663ff296a3c322cd059ad67f23c58bc8f0cc4c6"} Oct 01 10:19:31 crc kubenswrapper[4735]: I1001 10:19:31.998515 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6nvtz" 
event={"ID":"e3285eed-d809-4408-af84-a2020d07c59c","Type":"ContainerStarted","Data":"380ecc761a26637454b99755dd2aa275734207a55c18a6107462cb94f4756063"} Oct 01 10:19:32 crc kubenswrapper[4735]: I1001 10:19:32.056636 4735 patch_prober.go:28] interesting pod/router-default-5444994796-qc826 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 10:19:32 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 01 10:19:32 crc kubenswrapper[4735]: [+]process-running ok Oct 01 10:19:32 crc kubenswrapper[4735]: healthz check failed Oct 01 10:19:32 crc kubenswrapper[4735]: I1001 10:19:32.056713 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qc826" podUID="79bfc45c-3ae9-416b-9d4b-f56fab2387de" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 10:19:32 crc kubenswrapper[4735]: I1001 10:19:32.119146 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 01 10:19:32 crc kubenswrapper[4735]: W1001 10:19:32.133845 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc4ff8717_b8b7_445a_a152_4c1aab50702d.slice/crio-e6ef5dca77bf33dcdae454ef32601a15d063c4813daec278efcc0f02d8869c12 WatchSource:0}: Error finding container e6ef5dca77bf33dcdae454ef32601a15d063c4813daec278efcc0f02d8869c12: Status 404 returned error can't find the container with id e6ef5dca77bf33dcdae454ef32601a15d063c4813daec278efcc0f02d8869c12 Oct 01 10:19:33 crc kubenswrapper[4735]: I1001 10:19:33.016599 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c4ff8717-b8b7-445a-a152-4c1aab50702d","Type":"ContainerStarted","Data":"e6ef5dca77bf33dcdae454ef32601a15d063c4813daec278efcc0f02d8869c12"} Oct 01 10:19:33 
crc kubenswrapper[4735]: I1001 10:19:33.020766 4735 generic.go:334] "Generic (PLEG): container finished" podID="a7191c19-9c62-4c02-851c-1f871aed06f2" containerID="828640d5c654ee463a83e8c556eaf621d0ea7740ce62bb9adf61025040e32bad" exitCode=0 Oct 01 10:19:33 crc kubenswrapper[4735]: I1001 10:19:33.020816 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5z55n" event={"ID":"a7191c19-9c62-4c02-851c-1f871aed06f2","Type":"ContainerDied","Data":"828640d5c654ee463a83e8c556eaf621d0ea7740ce62bb9adf61025040e32bad"} Oct 01 10:19:33 crc kubenswrapper[4735]: I1001 10:19:33.056183 4735 patch_prober.go:28] interesting pod/router-default-5444994796-qc826 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 10:19:33 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 01 10:19:33 crc kubenswrapper[4735]: [+]process-running ok Oct 01 10:19:33 crc kubenswrapper[4735]: healthz check failed Oct 01 10:19:33 crc kubenswrapper[4735]: I1001 10:19:33.056228 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qc826" podUID="79bfc45c-3ae9-416b-9d4b-f56fab2387de" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 10:19:33 crc kubenswrapper[4735]: I1001 10:19:33.558261 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-qmlh7" Oct 01 10:19:34 crc kubenswrapper[4735]: I1001 10:19:34.030691 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c4ff8717-b8b7-445a-a152-4c1aab50702d","Type":"ContainerStarted","Data":"63fbaf44e6fbd92873465a731a3f4c9590c6c9b10f67e7957318b3e5efd33d50"} Oct 01 10:19:34 crc kubenswrapper[4735]: I1001 10:19:34.044698 4735 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.044680211 podStartE2EDuration="3.044680211s" podCreationTimestamp="2025-10-01 10:19:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:19:34.043187351 +0000 UTC m=+132.736008613" watchObservedRunningTime="2025-10-01 10:19:34.044680211 +0000 UTC m=+132.737501473" Oct 01 10:19:34 crc kubenswrapper[4735]: I1001 10:19:34.057306 4735 patch_prober.go:28] interesting pod/router-default-5444994796-qc826 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 10:19:34 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 01 10:19:34 crc kubenswrapper[4735]: [+]process-running ok Oct 01 10:19:34 crc kubenswrapper[4735]: healthz check failed Oct 01 10:19:34 crc kubenswrapper[4735]: I1001 10:19:34.057365 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qc826" podUID="79bfc45c-3ae9-416b-9d4b-f56fab2387de" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 10:19:35 crc kubenswrapper[4735]: I1001 10:19:35.046248 4735 generic.go:334] "Generic (PLEG): container finished" podID="c4ff8717-b8b7-445a-a152-4c1aab50702d" containerID="63fbaf44e6fbd92873465a731a3f4c9590c6c9b10f67e7957318b3e5efd33d50" exitCode=0 Oct 01 10:19:35 crc kubenswrapper[4735]: I1001 10:19:35.046300 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c4ff8717-b8b7-445a-a152-4c1aab50702d","Type":"ContainerDied","Data":"63fbaf44e6fbd92873465a731a3f4c9590c6c9b10f67e7957318b3e5efd33d50"} Oct 01 10:19:35 crc kubenswrapper[4735]: I1001 10:19:35.056867 4735 patch_prober.go:28] interesting 
pod/router-default-5444994796-qc826 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 10:19:35 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 01 10:19:35 crc kubenswrapper[4735]: [+]process-running ok Oct 01 10:19:35 crc kubenswrapper[4735]: healthz check failed Oct 01 10:19:35 crc kubenswrapper[4735]: I1001 10:19:35.056918 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qc826" podUID="79bfc45c-3ae9-416b-9d4b-f56fab2387de" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 10:19:36 crc kubenswrapper[4735]: I1001 10:19:36.058922 4735 patch_prober.go:28] interesting pod/router-default-5444994796-qc826 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 10:19:36 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 01 10:19:36 crc kubenswrapper[4735]: [+]process-running ok Oct 01 10:19:36 crc kubenswrapper[4735]: healthz check failed Oct 01 10:19:36 crc kubenswrapper[4735]: I1001 10:19:36.059210 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qc826" podUID="79bfc45c-3ae9-416b-9d4b-f56fab2387de" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 10:19:37 crc kubenswrapper[4735]: I1001 10:19:37.055846 4735 patch_prober.go:28] interesting pod/router-default-5444994796-qc826 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 10:19:37 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 01 10:19:37 crc kubenswrapper[4735]: 
[+]process-running ok Oct 01 10:19:37 crc kubenswrapper[4735]: healthz check failed Oct 01 10:19:37 crc kubenswrapper[4735]: I1001 10:19:37.055909 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qc826" podUID="79bfc45c-3ae9-416b-9d4b-f56fab2387de" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 10:19:38 crc kubenswrapper[4735]: I1001 10:19:38.055929 4735 patch_prober.go:28] interesting pod/router-default-5444994796-qc826 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 10:19:38 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 01 10:19:38 crc kubenswrapper[4735]: [+]process-running ok Oct 01 10:19:38 crc kubenswrapper[4735]: healthz check failed Oct 01 10:19:38 crc kubenswrapper[4735]: I1001 10:19:38.056187 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qc826" podUID="79bfc45c-3ae9-416b-9d4b-f56fab2387de" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 10:19:38 crc kubenswrapper[4735]: I1001 10:19:38.235540 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 10:19:38 crc kubenswrapper[4735]: I1001 10:19:38.317758 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c4ff8717-b8b7-445a-a152-4c1aab50702d-kube-api-access\") pod \"c4ff8717-b8b7-445a-a152-4c1aab50702d\" (UID: \"c4ff8717-b8b7-445a-a152-4c1aab50702d\") " Oct 01 10:19:38 crc kubenswrapper[4735]: I1001 10:19:38.317987 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c4ff8717-b8b7-445a-a152-4c1aab50702d-kubelet-dir\") pod \"c4ff8717-b8b7-445a-a152-4c1aab50702d\" (UID: \"c4ff8717-b8b7-445a-a152-4c1aab50702d\") " Oct 01 10:19:38 crc kubenswrapper[4735]: I1001 10:19:38.318134 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4ff8717-b8b7-445a-a152-4c1aab50702d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c4ff8717-b8b7-445a-a152-4c1aab50702d" (UID: "c4ff8717-b8b7-445a-a152-4c1aab50702d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 10:19:38 crc kubenswrapper[4735]: I1001 10:19:38.318446 4735 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c4ff8717-b8b7-445a-a152-4c1aab50702d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 01 10:19:38 crc kubenswrapper[4735]: I1001 10:19:38.324005 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4ff8717-b8b7-445a-a152-4c1aab50702d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c4ff8717-b8b7-445a-a152-4c1aab50702d" (UID: "c4ff8717-b8b7-445a-a152-4c1aab50702d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:19:38 crc kubenswrapper[4735]: I1001 10:19:38.419318 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c4ff8717-b8b7-445a-a152-4c1aab50702d-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 10:19:39 crc kubenswrapper[4735]: I1001 10:19:39.056131 4735 patch_prober.go:28] interesting pod/router-default-5444994796-qc826 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 10:19:39 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 01 10:19:39 crc kubenswrapper[4735]: [+]process-running ok Oct 01 10:19:39 crc kubenswrapper[4735]: healthz check failed Oct 01 10:19:39 crc kubenswrapper[4735]: I1001 10:19:39.056192 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qc826" podUID="79bfc45c-3ae9-416b-9d4b-f56fab2387de" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 10:19:39 crc kubenswrapper[4735]: I1001 10:19:39.067817 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c4ff8717-b8b7-445a-a152-4c1aab50702d","Type":"ContainerDied","Data":"e6ef5dca77bf33dcdae454ef32601a15d063c4813daec278efcc0f02d8869c12"} Oct 01 10:19:39 crc kubenswrapper[4735]: I1001 10:19:39.067848 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6ef5dca77bf33dcdae454ef32601a15d063c4813daec278efcc0f02d8869c12" Oct 01 10:19:39 crc kubenswrapper[4735]: I1001 10:19:39.067899 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 10:19:40 crc kubenswrapper[4735]: I1001 10:19:40.055987 4735 patch_prober.go:28] interesting pod/router-default-5444994796-qc826 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 10:19:40 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 01 10:19:40 crc kubenswrapper[4735]: [+]process-running ok Oct 01 10:19:40 crc kubenswrapper[4735]: healthz check failed Oct 01 10:19:40 crc kubenswrapper[4735]: I1001 10:19:40.056040 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qc826" podUID="79bfc45c-3ae9-416b-9d4b-f56fab2387de" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 10:19:40 crc kubenswrapper[4735]: I1001 10:19:40.718217 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-dm5x5" Oct 01 10:19:41 crc kubenswrapper[4735]: I1001 10:19:41.056030 4735 patch_prober.go:28] interesting pod/router-default-5444994796-qc826 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 10:19:41 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 01 10:19:41 crc kubenswrapper[4735]: [+]process-running ok Oct 01 10:19:41 crc kubenswrapper[4735]: healthz check failed Oct 01 10:19:41 crc kubenswrapper[4735]: I1001 10:19:41.056078 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qc826" podUID="79bfc45c-3ae9-416b-9d4b-f56fab2387de" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 10:19:41 crc kubenswrapper[4735]: I1001 10:19:41.392646 4735 
patch_prober.go:28] interesting pod/console-f9d7485db-xtfsg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.37:8443/health\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Oct 01 10:19:41 crc kubenswrapper[4735]: I1001 10:19:41.392764 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-xtfsg" podUID="5e28238a-9bf2-4e10-827d-7350e0ec0150" containerName="console" probeResult="failure" output="Get \"https://10.217.0.37:8443/health\": dial tcp 10.217.0.37:8443: connect: connection refused" Oct 01 10:19:42 crc kubenswrapper[4735]: I1001 10:19:42.055603 4735 patch_prober.go:28] interesting pod/router-default-5444994796-qc826 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 10:19:42 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Oct 01 10:19:42 crc kubenswrapper[4735]: [+]process-running ok Oct 01 10:19:42 crc kubenswrapper[4735]: healthz check failed Oct 01 10:19:42 crc kubenswrapper[4735]: I1001 10:19:42.055657 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qc826" podUID="79bfc45c-3ae9-416b-9d4b-f56fab2387de" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 10:19:43 crc kubenswrapper[4735]: I1001 10:19:43.056560 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-qc826" Oct 01 10:19:43 crc kubenswrapper[4735]: I1001 10:19:43.058293 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-qc826" Oct 01 10:19:48 crc kubenswrapper[4735]: E1001 10:19:48.262512 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying 
system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 01 10:19:48 crc kubenswrapper[4735]: E1001 10:19:48.263192 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mqb7k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-d9sm9_openshift-marketplace(5608a54e-a42f-4e14-aa3e-a5503c7a4dc4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: 
copying config: context canceled" logger="UnhandledError" Oct 01 10:19:48 crc kubenswrapper[4735]: E1001 10:19:48.264349 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-d9sm9" podUID="5608a54e-a42f-4e14-aa3e-a5503c7a4dc4" Oct 01 10:19:48 crc kubenswrapper[4735]: I1001 10:19:48.282804 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:19:48 crc kubenswrapper[4735]: I1001 10:19:48.952785 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:19:48 crc kubenswrapper[4735]: I1001 10:19:48.952830 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:19:48 crc kubenswrapper[4735]: I1001 10:19:48.952872 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:19:48 crc 
kubenswrapper[4735]: I1001 10:19:48.952910 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:19:48 crc kubenswrapper[4735]: I1001 10:19:48.955066 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 01 10:19:48 crc kubenswrapper[4735]: I1001 10:19:48.955075 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 01 10:19:48 crc kubenswrapper[4735]: I1001 10:19:48.955130 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 01 10:19:48 crc kubenswrapper[4735]: I1001 10:19:48.965510 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 01 10:19:48 crc kubenswrapper[4735]: I1001 10:19:48.973735 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:19:48 crc kubenswrapper[4735]: I1001 10:19:48.977461 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:19:48 crc kubenswrapper[4735]: I1001 10:19:48.977621 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:19:49 crc kubenswrapper[4735]: I1001 10:19:49.216523 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 10:19:49 crc kubenswrapper[4735]: I1001 10:19:49.220399 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:19:49 crc kubenswrapper[4735]: I1001 10:19:49.223278 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:19:49 crc kubenswrapper[4735]: E1001 10:19:49.292212 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-d9sm9" podUID="5608a54e-a42f-4e14-aa3e-a5503c7a4dc4" Oct 01 10:19:49 crc kubenswrapper[4735]: I1001 10:19:49.509319 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 10:19:50 crc kubenswrapper[4735]: E1001 10:19:50.397367 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 01 10:19:50 crc kubenswrapper[4735]: E1001 10:19:50.397564 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lq245,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe
:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-kq8q9_openshift-marketplace(3adb5d44-1cc9-4955-8ca4-a27b4e561251): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 10:19:50 crc kubenswrapper[4735]: E1001 10:19:50.399132 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-kq8q9" podUID="3adb5d44-1cc9-4955-8ca4-a27b4e561251" Oct 01 10:19:51 crc kubenswrapper[4735]: I1001 10:19:51.399563 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-xtfsg" Oct 01 10:19:51 crc kubenswrapper[4735]: I1001 10:19:51.405927 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-xtfsg" Oct 01 10:19:56 crc kubenswrapper[4735]: E1001 10:19:56.953397 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-kq8q9" podUID="3adb5d44-1cc9-4955-8ca4-a27b4e561251" Oct 01 10:19:58 crc kubenswrapper[4735]: E1001 10:19:58.030792 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 01 10:19:58 crc kubenswrapper[4735]: E1001 10:19:58.031282 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jtr5n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-5z55n_openshift-marketplace(a7191c19-9c62-4c02-851c-1f871aed06f2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 10:19:58 crc kubenswrapper[4735]: E1001 10:19:58.032555 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = 
Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-5z55n" podUID="a7191c19-9c62-4c02-851c-1f871aed06f2" Oct 01 10:20:00 crc kubenswrapper[4735]: E1001 10:20:00.843818 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-5z55n" podUID="a7191c19-9c62-4c02-851c-1f871aed06f2" Oct 01 10:20:00 crc kubenswrapper[4735]: E1001 10:20:00.860354 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 01 10:20:00 crc kubenswrapper[4735]: E1001 10:20:00.860484 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-44sbb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-k844p_openshift-marketplace(0c167fd0-1242-4be7-8261-391836d921ae): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 10:20:00 crc kubenswrapper[4735]: E1001 10:20:00.861570 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-k844p" podUID="0c167fd0-1242-4be7-8261-391836d921ae" Oct 01 10:20:01 crc 
kubenswrapper[4735]: E1001 10:20:01.437122 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-k844p" podUID="0c167fd0-1242-4be7-8261-391836d921ae" Oct 01 10:20:01 crc kubenswrapper[4735]: I1001 10:20:01.451299 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fkcd4" Oct 01 10:20:01 crc kubenswrapper[4735]: E1001 10:20:01.501845 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 01 10:20:01 crc kubenswrapper[4735]: E1001 10:20:01.501994 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dj856,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-x7j8p_openshift-marketplace(66aa965b-9b48-44f3-9d53-e1b0ff1829dd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 10:20:01 crc kubenswrapper[4735]: E1001 10:20:01.503370 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-x7j8p" podUID="66aa965b-9b48-44f3-9d53-e1b0ff1829dd" Oct 01 10:20:01 crc 
kubenswrapper[4735]: E1001 10:20:01.526969 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 01 10:20:01 crc kubenswrapper[4735]: E1001 10:20:01.527313 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gckxs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-6nvtz_openshift-marketplace(e3285eed-d809-4408-af84-a2020d07c59c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 10:20:01 crc kubenswrapper[4735]: E1001 10:20:01.528878 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-6nvtz" podUID="e3285eed-d809-4408-af84-a2020d07c59c" Oct 01 10:20:01 crc kubenswrapper[4735]: W1001 10:20:01.889968 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-c6233bdcf7de442045bc0e426f0dbf092115068603c26fee10e563843c635e82 WatchSource:0}: Error finding container c6233bdcf7de442045bc0e426f0dbf092115068603c26fee10e563843c635e82: Status 404 returned error can't find the container with id c6233bdcf7de442045bc0e426f0dbf092115068603c26fee10e563843c635e82 Oct 01 10:20:01 crc kubenswrapper[4735]: W1001 10:20:01.891435 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-0543162bd1263543441f591ee619e26bbee1ee37911d8dae3475cb27ebd397a1 WatchSource:0}: Error finding container 0543162bd1263543441f591ee619e26bbee1ee37911d8dae3475cb27ebd397a1: Status 404 returned error can't find the container with id 0543162bd1263543441f591ee619e26bbee1ee37911d8dae3475cb27ebd397a1 Oct 01 10:20:01 crc kubenswrapper[4735]: W1001 10:20:01.894405 4735 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-10acc4ff89d917d717b20ae0d8af8e072519d929aab61c0b1a92353f3bbf4783 WatchSource:0}: Error finding container 10acc4ff89d917d717b20ae0d8af8e072519d929aab61c0b1a92353f3bbf4783: Status 404 returned error can't find the container with id 10acc4ff89d917d717b20ae0d8af8e072519d929aab61c0b1a92353f3bbf4783 Oct 01 10:20:02 crc kubenswrapper[4735]: I1001 10:20:02.185116 4735 generic.go:334] "Generic (PLEG): container finished" podID="b2ae6c32-20b1-4c6b-872d-8c72c5a28af6" containerID="adbd8bf87bb89f361f4685aa00683088a4de0103e85750b34b4b44f62086dde7" exitCode=0 Oct 01 10:20:02 crc kubenswrapper[4735]: I1001 10:20:02.185430 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hx799" event={"ID":"b2ae6c32-20b1-4c6b-872d-8c72c5a28af6","Type":"ContainerDied","Data":"adbd8bf87bb89f361f4685aa00683088a4de0103e85750b34b4b44f62086dde7"} Oct 01 10:20:02 crc kubenswrapper[4735]: I1001 10:20:02.189175 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d9sm9" event={"ID":"5608a54e-a42f-4e14-aa3e-a5503c7a4dc4","Type":"ContainerStarted","Data":"4436c6d314d1c2f17a73f3df1b51d2ad17c28d82d02c59a294c85930647eddf3"} Oct 01 10:20:02 crc kubenswrapper[4735]: I1001 10:20:02.191922 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"91d4a031949455b4175f36ac69d6092712bfccbe35146f9db9fdc8ae42b4b444"} Oct 01 10:20:02 crc kubenswrapper[4735]: I1001 10:20:02.191956 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"10acc4ff89d917d717b20ae0d8af8e072519d929aab61c0b1a92353f3bbf4783"} Oct 01 10:20:02 crc 
kubenswrapper[4735]: I1001 10:20:02.192122 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:20:02 crc kubenswrapper[4735]: I1001 10:20:02.193441 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"694188e9215966d9f368f7082917d163d7702e76065462ae72fb2a071e16b314"} Oct 01 10:20:02 crc kubenswrapper[4735]: I1001 10:20:02.193465 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c6233bdcf7de442045bc0e426f0dbf092115068603c26fee10e563843c635e82"} Oct 01 10:20:02 crc kubenswrapper[4735]: I1001 10:20:02.195026 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"463c4912961a74f7a0cd35b9db1eb13d9049ede0a5ce1a18a91a52f9fe9a2c81"} Oct 01 10:20:02 crc kubenswrapper[4735]: I1001 10:20:02.195055 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"0543162bd1263543441f591ee619e26bbee1ee37911d8dae3475cb27ebd397a1"} Oct 01 10:20:02 crc kubenswrapper[4735]: I1001 10:20:02.196988 4735 generic.go:334] "Generic (PLEG): container finished" podID="ac706b67-93f2-4ccd-a0e8-7ebb309bc905" containerID="bbd329908b04eb22be53d2f9280258faf04450a4da788192f8568a736a8f3a6c" exitCode=0 Oct 01 10:20:02 crc kubenswrapper[4735]: I1001 10:20:02.197071 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h9dbw" 
event={"ID":"ac706b67-93f2-4ccd-a0e8-7ebb309bc905","Type":"ContainerDied","Data":"bbd329908b04eb22be53d2f9280258faf04450a4da788192f8568a736a8f3a6c"} Oct 01 10:20:02 crc kubenswrapper[4735]: E1001 10:20:02.198286 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-6nvtz" podUID="e3285eed-d809-4408-af84-a2020d07c59c" Oct 01 10:20:02 crc kubenswrapper[4735]: E1001 10:20:02.198297 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-x7j8p" podUID="66aa965b-9b48-44f3-9d53-e1b0ff1829dd" Oct 01 10:20:03 crc kubenswrapper[4735]: I1001 10:20:03.203658 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hx799" event={"ID":"b2ae6c32-20b1-4c6b-872d-8c72c5a28af6","Type":"ContainerStarted","Data":"9b6258e8be5cb301106d12900e8541b471227238253b0560400711cd77b8c8db"} Oct 01 10:20:03 crc kubenswrapper[4735]: I1001 10:20:03.207205 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h9dbw" event={"ID":"ac706b67-93f2-4ccd-a0e8-7ebb309bc905","Type":"ContainerStarted","Data":"676b1cfd9def1384b48b7ef59e38915e16acce2d15294275b1cf62a7bb5ad9cc"} Oct 01 10:20:03 crc kubenswrapper[4735]: I1001 10:20:03.208885 4735 generic.go:334] "Generic (PLEG): container finished" podID="5608a54e-a42f-4e14-aa3e-a5503c7a4dc4" containerID="4436c6d314d1c2f17a73f3df1b51d2ad17c28d82d02c59a294c85930647eddf3" exitCode=0 Oct 01 10:20:03 crc kubenswrapper[4735]: I1001 10:20:03.208957 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-d9sm9" event={"ID":"5608a54e-a42f-4e14-aa3e-a5503c7a4dc4","Type":"ContainerDied","Data":"4436c6d314d1c2f17a73f3df1b51d2ad17c28d82d02c59a294c85930647eddf3"} Oct 01 10:20:03 crc kubenswrapper[4735]: I1001 10:20:03.225605 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hx799" podStartSLOduration=2.487552922 podStartE2EDuration="34.225570767s" podCreationTimestamp="2025-10-01 10:19:29 +0000 UTC" firstStartedPulling="2025-10-01 10:19:30.968390026 +0000 UTC m=+129.661211288" lastFinishedPulling="2025-10-01 10:20:02.706407851 +0000 UTC m=+161.399229133" observedRunningTime="2025-10-01 10:20:03.221891453 +0000 UTC m=+161.914712745" watchObservedRunningTime="2025-10-01 10:20:03.225570767 +0000 UTC m=+161.918392029" Oct 01 10:20:03 crc kubenswrapper[4735]: I1001 10:20:03.240838 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h9dbw" podStartSLOduration=2.18085234 podStartE2EDuration="36.240820841s" podCreationTimestamp="2025-10-01 10:19:27 +0000 UTC" firstStartedPulling="2025-10-01 10:19:28.936252112 +0000 UTC m=+127.629073374" lastFinishedPulling="2025-10-01 10:20:02.996220613 +0000 UTC m=+161.689041875" observedRunningTime="2025-10-01 10:20:03.239535094 +0000 UTC m=+161.932356376" watchObservedRunningTime="2025-10-01 10:20:03.240820841 +0000 UTC m=+161.933642103" Oct 01 10:20:03 crc kubenswrapper[4735]: I1001 10:20:03.245247 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77b56a8b-1a27-4727-b45e-43fbc3847ddd-metrics-certs\") pod \"network-metrics-daemon-qm6mr\" (UID: \"77b56a8b-1a27-4727-b45e-43fbc3847ddd\") " pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:20:03 crc kubenswrapper[4735]: I1001 10:20:03.247033 4735 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"metrics-daemon-secret" Oct 01 10:20:03 crc kubenswrapper[4735]: I1001 10:20:03.260283 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77b56a8b-1a27-4727-b45e-43fbc3847ddd-metrics-certs\") pod \"network-metrics-daemon-qm6mr\" (UID: \"77b56a8b-1a27-4727-b45e-43fbc3847ddd\") " pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:20:03 crc kubenswrapper[4735]: I1001 10:20:03.418261 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 01 10:20:03 crc kubenswrapper[4735]: I1001 10:20:03.427406 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm6mr" Oct 01 10:20:03 crc kubenswrapper[4735]: I1001 10:20:03.615054 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qm6mr"] Oct 01 10:20:04 crc kubenswrapper[4735]: I1001 10:20:04.216232 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d9sm9" event={"ID":"5608a54e-a42f-4e14-aa3e-a5503c7a4dc4","Type":"ContainerStarted","Data":"991c2d379f559efd04bc48cd1a8a3718f2733d6a87e1480e48f4d3a45866b9de"} Oct 01 10:20:04 crc kubenswrapper[4735]: I1001 10:20:04.218995 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qm6mr" event={"ID":"77b56a8b-1a27-4727-b45e-43fbc3847ddd","Type":"ContainerStarted","Data":"9fd5af8a33a90ba085826b47fc902cdf2066720c8ff875349a7595f659cd83e5"} Oct 01 10:20:04 crc kubenswrapper[4735]: I1001 10:20:04.219036 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qm6mr" event={"ID":"77b56a8b-1a27-4727-b45e-43fbc3847ddd","Type":"ContainerStarted","Data":"35876280c55491711cf218799087a54f87359eccd306bbb154c1331f2d8cdd27"} Oct 01 10:20:04 crc kubenswrapper[4735]: I1001 
10:20:04.219050 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qm6mr" event={"ID":"77b56a8b-1a27-4727-b45e-43fbc3847ddd","Type":"ContainerStarted","Data":"14a8cf194641ff924730f08c23d4104fdd79d41e1129804b28feed594906d328"} Oct 01 10:20:04 crc kubenswrapper[4735]: I1001 10:20:04.241041 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d9sm9" podStartSLOduration=2.507583667 podStartE2EDuration="37.241023561s" podCreationTimestamp="2025-10-01 10:19:27 +0000 UTC" firstStartedPulling="2025-10-01 10:19:28.938738289 +0000 UTC m=+127.631559551" lastFinishedPulling="2025-10-01 10:20:03.672178183 +0000 UTC m=+162.364999445" observedRunningTime="2025-10-01 10:20:04.239019493 +0000 UTC m=+162.931840775" watchObservedRunningTime="2025-10-01 10:20:04.241023561 +0000 UTC m=+162.933844823" Oct 01 10:20:04 crc kubenswrapper[4735]: I1001 10:20:04.256145 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-qm6mr" podStartSLOduration=143.25612741 podStartE2EDuration="2m23.25612741s" podCreationTimestamp="2025-10-01 10:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:20:04.255916164 +0000 UTC m=+162.948737426" watchObservedRunningTime="2025-10-01 10:20:04.25612741 +0000 UTC m=+162.948948672" Oct 01 10:20:05 crc kubenswrapper[4735]: I1001 10:20:05.486046 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 10:20:05 crc kubenswrapper[4735]: I1001 10:20:05.486690 4735 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 10:20:07 crc kubenswrapper[4735]: I1001 10:20:07.949052 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h9dbw" Oct 01 10:20:07 crc kubenswrapper[4735]: I1001 10:20:07.951449 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h9dbw" Oct 01 10:20:08 crc kubenswrapper[4735]: I1001 10:20:08.080832 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h9dbw" Oct 01 10:20:08 crc kubenswrapper[4735]: I1001 10:20:08.274802 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h9dbw" Oct 01 10:20:08 crc kubenswrapper[4735]: I1001 10:20:08.345034 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d9sm9" Oct 01 10:20:08 crc kubenswrapper[4735]: I1001 10:20:08.345091 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d9sm9" Oct 01 10:20:08 crc kubenswrapper[4735]: I1001 10:20:08.383693 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d9sm9" Oct 01 10:20:09 crc kubenswrapper[4735]: I1001 10:20:09.283030 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d9sm9" Oct 01 10:20:09 crc kubenswrapper[4735]: I1001 10:20:09.910542 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hx799" Oct 01 
10:20:09 crc kubenswrapper[4735]: I1001 10:20:09.910608 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hx799" Oct 01 10:20:09 crc kubenswrapper[4735]: I1001 10:20:09.953644 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hx799" Oct 01 10:20:10 crc kubenswrapper[4735]: I1001 10:20:10.250773 4735 generic.go:334] "Generic (PLEG): container finished" podID="3adb5d44-1cc9-4955-8ca4-a27b4e561251" containerID="d32ed0fd4818428beb73b5cc5b04c6ccba2fd7636ce929fbc22db9386b8a1936" exitCode=0 Oct 01 10:20:10 crc kubenswrapper[4735]: I1001 10:20:10.250981 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kq8q9" event={"ID":"3adb5d44-1cc9-4955-8ca4-a27b4e561251","Type":"ContainerDied","Data":"d32ed0fd4818428beb73b5cc5b04c6ccba2fd7636ce929fbc22db9386b8a1936"} Oct 01 10:20:10 crc kubenswrapper[4735]: I1001 10:20:10.289220 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hx799" Oct 01 10:20:10 crc kubenswrapper[4735]: I1001 10:20:10.993739 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d9sm9"] Oct 01 10:20:11 crc kubenswrapper[4735]: I1001 10:20:11.258221 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kq8q9" event={"ID":"3adb5d44-1cc9-4955-8ca4-a27b4e561251","Type":"ContainerStarted","Data":"5f15b7d095bd5e26977802bfa8b49368ffc073c6683e2056ea05e53a1a752a3f"} Oct 01 10:20:11 crc kubenswrapper[4735]: I1001 10:20:11.258957 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d9sm9" podUID="5608a54e-a42f-4e14-aa3e-a5503c7a4dc4" containerName="registry-server" containerID="cri-o://991c2d379f559efd04bc48cd1a8a3718f2733d6a87e1480e48f4d3a45866b9de" 
gracePeriod=2 Oct 01 10:20:11 crc kubenswrapper[4735]: I1001 10:20:11.619308 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d9sm9" Oct 01 10:20:11 crc kubenswrapper[4735]: I1001 10:20:11.640734 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kq8q9" podStartSLOduration=2.915742522 podStartE2EDuration="44.640719906s" podCreationTimestamp="2025-10-01 10:19:27 +0000 UTC" firstStartedPulling="2025-10-01 10:19:28.937811844 +0000 UTC m=+127.630633116" lastFinishedPulling="2025-10-01 10:20:10.662789228 +0000 UTC m=+169.355610500" observedRunningTime="2025-10-01 10:20:11.276635844 +0000 UTC m=+169.969457126" watchObservedRunningTime="2025-10-01 10:20:11.640719906 +0000 UTC m=+170.333541168" Oct 01 10:20:11 crc kubenswrapper[4735]: I1001 10:20:11.747237 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5608a54e-a42f-4e14-aa3e-a5503c7a4dc4-catalog-content\") pod \"5608a54e-a42f-4e14-aa3e-a5503c7a4dc4\" (UID: \"5608a54e-a42f-4e14-aa3e-a5503c7a4dc4\") " Oct 01 10:20:11 crc kubenswrapper[4735]: I1001 10:20:11.747327 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5608a54e-a42f-4e14-aa3e-a5503c7a4dc4-utilities\") pod \"5608a54e-a42f-4e14-aa3e-a5503c7a4dc4\" (UID: \"5608a54e-a42f-4e14-aa3e-a5503c7a4dc4\") " Oct 01 10:20:11 crc kubenswrapper[4735]: I1001 10:20:11.748088 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5608a54e-a42f-4e14-aa3e-a5503c7a4dc4-utilities" (OuterVolumeSpecName: "utilities") pod "5608a54e-a42f-4e14-aa3e-a5503c7a4dc4" (UID: "5608a54e-a42f-4e14-aa3e-a5503c7a4dc4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:20:11 crc kubenswrapper[4735]: I1001 10:20:11.748204 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqb7k\" (UniqueName: \"kubernetes.io/projected/5608a54e-a42f-4e14-aa3e-a5503c7a4dc4-kube-api-access-mqb7k\") pod \"5608a54e-a42f-4e14-aa3e-a5503c7a4dc4\" (UID: \"5608a54e-a42f-4e14-aa3e-a5503c7a4dc4\") " Oct 01 10:20:11 crc kubenswrapper[4735]: I1001 10:20:11.749257 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5608a54e-a42f-4e14-aa3e-a5503c7a4dc4-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 10:20:11 crc kubenswrapper[4735]: I1001 10:20:11.757883 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5608a54e-a42f-4e14-aa3e-a5503c7a4dc4-kube-api-access-mqb7k" (OuterVolumeSpecName: "kube-api-access-mqb7k") pod "5608a54e-a42f-4e14-aa3e-a5503c7a4dc4" (UID: "5608a54e-a42f-4e14-aa3e-a5503c7a4dc4"). InnerVolumeSpecName "kube-api-access-mqb7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:20:11 crc kubenswrapper[4735]: I1001 10:20:11.799651 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5608a54e-a42f-4e14-aa3e-a5503c7a4dc4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5608a54e-a42f-4e14-aa3e-a5503c7a4dc4" (UID: "5608a54e-a42f-4e14-aa3e-a5503c7a4dc4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:20:11 crc kubenswrapper[4735]: I1001 10:20:11.850165 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqb7k\" (UniqueName: \"kubernetes.io/projected/5608a54e-a42f-4e14-aa3e-a5503c7a4dc4-kube-api-access-mqb7k\") on node \"crc\" DevicePath \"\"" Oct 01 10:20:11 crc kubenswrapper[4735]: I1001 10:20:11.850209 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5608a54e-a42f-4e14-aa3e-a5503c7a4dc4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 10:20:12 crc kubenswrapper[4735]: I1001 10:20:12.265065 4735 generic.go:334] "Generic (PLEG): container finished" podID="5608a54e-a42f-4e14-aa3e-a5503c7a4dc4" containerID="991c2d379f559efd04bc48cd1a8a3718f2733d6a87e1480e48f4d3a45866b9de" exitCode=0 Oct 01 10:20:12 crc kubenswrapper[4735]: I1001 10:20:12.265112 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d9sm9" event={"ID":"5608a54e-a42f-4e14-aa3e-a5503c7a4dc4","Type":"ContainerDied","Data":"991c2d379f559efd04bc48cd1a8a3718f2733d6a87e1480e48f4d3a45866b9de"} Oct 01 10:20:12 crc kubenswrapper[4735]: I1001 10:20:12.265138 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d9sm9" event={"ID":"5608a54e-a42f-4e14-aa3e-a5503c7a4dc4","Type":"ContainerDied","Data":"0d3dfd3fa57498da0ddf3bb0f95a9a76b7cbfac8e3d3b37076bad2e3afebbac8"} Oct 01 10:20:12 crc kubenswrapper[4735]: I1001 10:20:12.265157 4735 scope.go:117] "RemoveContainer" containerID="991c2d379f559efd04bc48cd1a8a3718f2733d6a87e1480e48f4d3a45866b9de" Oct 01 10:20:12 crc kubenswrapper[4735]: I1001 10:20:12.265192 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d9sm9" Oct 01 10:20:12 crc kubenswrapper[4735]: I1001 10:20:12.283523 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d9sm9"] Oct 01 10:20:12 crc kubenswrapper[4735]: I1001 10:20:12.286923 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d9sm9"] Oct 01 10:20:12 crc kubenswrapper[4735]: I1001 10:20:12.287002 4735 scope.go:117] "RemoveContainer" containerID="4436c6d314d1c2f17a73f3df1b51d2ad17c28d82d02c59a294c85930647eddf3" Oct 01 10:20:12 crc kubenswrapper[4735]: I1001 10:20:12.306841 4735 scope.go:117] "RemoveContainer" containerID="3c4f44fa70bb4e00f8f21d1221983535a1fad684324a8fa59ff117354160c913" Oct 01 10:20:12 crc kubenswrapper[4735]: I1001 10:20:12.322237 4735 scope.go:117] "RemoveContainer" containerID="991c2d379f559efd04bc48cd1a8a3718f2733d6a87e1480e48f4d3a45866b9de" Oct 01 10:20:12 crc kubenswrapper[4735]: E1001 10:20:12.322752 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"991c2d379f559efd04bc48cd1a8a3718f2733d6a87e1480e48f4d3a45866b9de\": container with ID starting with 991c2d379f559efd04bc48cd1a8a3718f2733d6a87e1480e48f4d3a45866b9de not found: ID does not exist" containerID="991c2d379f559efd04bc48cd1a8a3718f2733d6a87e1480e48f4d3a45866b9de" Oct 01 10:20:12 crc kubenswrapper[4735]: I1001 10:20:12.322787 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"991c2d379f559efd04bc48cd1a8a3718f2733d6a87e1480e48f4d3a45866b9de"} err="failed to get container status \"991c2d379f559efd04bc48cd1a8a3718f2733d6a87e1480e48f4d3a45866b9de\": rpc error: code = NotFound desc = could not find container \"991c2d379f559efd04bc48cd1a8a3718f2733d6a87e1480e48f4d3a45866b9de\": container with ID starting with 991c2d379f559efd04bc48cd1a8a3718f2733d6a87e1480e48f4d3a45866b9de not 
found: ID does not exist" Oct 01 10:20:12 crc kubenswrapper[4735]: I1001 10:20:12.322838 4735 scope.go:117] "RemoveContainer" containerID="4436c6d314d1c2f17a73f3df1b51d2ad17c28d82d02c59a294c85930647eddf3" Oct 01 10:20:12 crc kubenswrapper[4735]: E1001 10:20:12.325385 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4436c6d314d1c2f17a73f3df1b51d2ad17c28d82d02c59a294c85930647eddf3\": container with ID starting with 4436c6d314d1c2f17a73f3df1b51d2ad17c28d82d02c59a294c85930647eddf3 not found: ID does not exist" containerID="4436c6d314d1c2f17a73f3df1b51d2ad17c28d82d02c59a294c85930647eddf3" Oct 01 10:20:12 crc kubenswrapper[4735]: I1001 10:20:12.325440 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4436c6d314d1c2f17a73f3df1b51d2ad17c28d82d02c59a294c85930647eddf3"} err="failed to get container status \"4436c6d314d1c2f17a73f3df1b51d2ad17c28d82d02c59a294c85930647eddf3\": rpc error: code = NotFound desc = could not find container \"4436c6d314d1c2f17a73f3df1b51d2ad17c28d82d02c59a294c85930647eddf3\": container with ID starting with 4436c6d314d1c2f17a73f3df1b51d2ad17c28d82d02c59a294c85930647eddf3 not found: ID does not exist" Oct 01 10:20:12 crc kubenswrapper[4735]: I1001 10:20:12.325476 4735 scope.go:117] "RemoveContainer" containerID="3c4f44fa70bb4e00f8f21d1221983535a1fad684324a8fa59ff117354160c913" Oct 01 10:20:12 crc kubenswrapper[4735]: E1001 10:20:12.325842 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c4f44fa70bb4e00f8f21d1221983535a1fad684324a8fa59ff117354160c913\": container with ID starting with 3c4f44fa70bb4e00f8f21d1221983535a1fad684324a8fa59ff117354160c913 not found: ID does not exist" containerID="3c4f44fa70bb4e00f8f21d1221983535a1fad684324a8fa59ff117354160c913" Oct 01 10:20:12 crc kubenswrapper[4735]: I1001 10:20:12.325873 4735 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c4f44fa70bb4e00f8f21d1221983535a1fad684324a8fa59ff117354160c913"} err="failed to get container status \"3c4f44fa70bb4e00f8f21d1221983535a1fad684324a8fa59ff117354160c913\": rpc error: code = NotFound desc = could not find container \"3c4f44fa70bb4e00f8f21d1221983535a1fad684324a8fa59ff117354160c913\": container with ID starting with 3c4f44fa70bb4e00f8f21d1221983535a1fad684324a8fa59ff117354160c913 not found: ID does not exist" Oct 01 10:20:13 crc kubenswrapper[4735]: I1001 10:20:13.913791 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5608a54e-a42f-4e14-aa3e-a5503c7a4dc4" path="/var/lib/kubelet/pods/5608a54e-a42f-4e14-aa3e-a5503c7a4dc4/volumes" Oct 01 10:20:15 crc kubenswrapper[4735]: I1001 10:20:15.290752 4735 generic.go:334] "Generic (PLEG): container finished" podID="66aa965b-9b48-44f3-9d53-e1b0ff1829dd" containerID="205cdf8c9e62292efcdf0e729924db8bce490dc64e2317c9ceeced1cee58bbe5" exitCode=0 Oct 01 10:20:15 crc kubenswrapper[4735]: I1001 10:20:15.290832 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x7j8p" event={"ID":"66aa965b-9b48-44f3-9d53-e1b0ff1829dd","Type":"ContainerDied","Data":"205cdf8c9e62292efcdf0e729924db8bce490dc64e2317c9ceeced1cee58bbe5"} Oct 01 10:20:16 crc kubenswrapper[4735]: I1001 10:20:16.299730 4735 generic.go:334] "Generic (PLEG): container finished" podID="0c167fd0-1242-4be7-8261-391836d921ae" containerID="629a7cd1ea5997552157d436f7ecb6619da9005088d4b9a3011a3d1e374e215e" exitCode=0 Oct 01 10:20:16 crc kubenswrapper[4735]: I1001 10:20:16.299818 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k844p" event={"ID":"0c167fd0-1242-4be7-8261-391836d921ae","Type":"ContainerDied","Data":"629a7cd1ea5997552157d436f7ecb6619da9005088d4b9a3011a3d1e374e215e"} Oct 01 10:20:16 crc kubenswrapper[4735]: I1001 10:20:16.303933 4735 
generic.go:334] "Generic (PLEG): container finished" podID="e3285eed-d809-4408-af84-a2020d07c59c" containerID="3086162560d419d1388efc582c9724accd9d630bfdf2eb1f05dd151c37934b73" exitCode=0 Oct 01 10:20:16 crc kubenswrapper[4735]: I1001 10:20:16.304000 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6nvtz" event={"ID":"e3285eed-d809-4408-af84-a2020d07c59c","Type":"ContainerDied","Data":"3086162560d419d1388efc582c9724accd9d630bfdf2eb1f05dd151c37934b73"} Oct 01 10:20:16 crc kubenswrapper[4735]: I1001 10:20:16.308127 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x7j8p" event={"ID":"66aa965b-9b48-44f3-9d53-e1b0ff1829dd","Type":"ContainerStarted","Data":"e500d840d80ab66ec478a2be47c269a1c76aa72b4f97a6ced0fcaf2a61a8b0cb"} Oct 01 10:20:17 crc kubenswrapper[4735]: I1001 10:20:17.315464 4735 generic.go:334] "Generic (PLEG): container finished" podID="a7191c19-9c62-4c02-851c-1f871aed06f2" containerID="0a03ae71be4d2da036bf63287a825a7f5bfb14dc6971443b24f782322f665a86" exitCode=0 Oct 01 10:20:17 crc kubenswrapper[4735]: I1001 10:20:17.315585 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5z55n" event={"ID":"a7191c19-9c62-4c02-851c-1f871aed06f2","Type":"ContainerDied","Data":"0a03ae71be4d2da036bf63287a825a7f5bfb14dc6971443b24f782322f665a86"} Oct 01 10:20:17 crc kubenswrapper[4735]: I1001 10:20:17.318719 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k844p" event={"ID":"0c167fd0-1242-4be7-8261-391836d921ae","Type":"ContainerStarted","Data":"7f9c04079b076e823f5375dd0381b1e04b12920bcd9b83979430283d5c0c8e31"} Oct 01 10:20:17 crc kubenswrapper[4735]: I1001 10:20:17.321030 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6nvtz" 
event={"ID":"e3285eed-d809-4408-af84-a2020d07c59c","Type":"ContainerStarted","Data":"e71b5c3094f25d9617018c313c229a930525f767b52525775e0883b66b4001e8"} Oct 01 10:20:17 crc kubenswrapper[4735]: I1001 10:20:17.338647 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x7j8p" podStartSLOduration=3.329235139 podStartE2EDuration="50.338624622s" podCreationTimestamp="2025-10-01 10:19:27 +0000 UTC" firstStartedPulling="2025-10-01 10:19:28.942274283 +0000 UTC m=+127.635095545" lastFinishedPulling="2025-10-01 10:20:15.951663766 +0000 UTC m=+174.644485028" observedRunningTime="2025-10-01 10:20:16.355070425 +0000 UTC m=+175.047891687" watchObservedRunningTime="2025-10-01 10:20:17.338624622 +0000 UTC m=+176.031445884" Oct 01 10:20:17 crc kubenswrapper[4735]: I1001 10:20:17.360838 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k844p" podStartSLOduration=3.352903477 podStartE2EDuration="48.360820572s" podCreationTimestamp="2025-10-01 10:19:29 +0000 UTC" firstStartedPulling="2025-10-01 10:19:31.998176462 +0000 UTC m=+130.690997724" lastFinishedPulling="2025-10-01 10:20:17.006093557 +0000 UTC m=+175.698914819" observedRunningTime="2025-10-01 10:20:17.356407437 +0000 UTC m=+176.049228699" watchObservedRunningTime="2025-10-01 10:20:17.360820572 +0000 UTC m=+176.053641834" Oct 01 10:20:17 crc kubenswrapper[4735]: I1001 10:20:17.378035 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6nvtz" podStartSLOduration=2.500876312 podStartE2EDuration="47.378018111s" podCreationTimestamp="2025-10-01 10:19:30 +0000 UTC" firstStartedPulling="2025-10-01 10:19:32.003754621 +0000 UTC m=+130.696575883" lastFinishedPulling="2025-10-01 10:20:16.88089641 +0000 UTC m=+175.573717682" observedRunningTime="2025-10-01 10:20:17.376152558 +0000 UTC m=+176.068973820" watchObservedRunningTime="2025-10-01 
10:20:17.378018111 +0000 UTC m=+176.070839373" Oct 01 10:20:17 crc kubenswrapper[4735]: I1001 10:20:17.720300 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x7j8p" Oct 01 10:20:17 crc kubenswrapper[4735]: I1001 10:20:17.720635 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x7j8p" Oct 01 10:20:17 crc kubenswrapper[4735]: I1001 10:20:17.758343 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x7j8p" Oct 01 10:20:18 crc kubenswrapper[4735]: I1001 10:20:18.110665 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kq8q9" Oct 01 10:20:18 crc kubenswrapper[4735]: I1001 10:20:18.110722 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kq8q9" Oct 01 10:20:18 crc kubenswrapper[4735]: I1001 10:20:18.147058 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kq8q9" Oct 01 10:20:18 crc kubenswrapper[4735]: I1001 10:20:18.329192 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5z55n" event={"ID":"a7191c19-9c62-4c02-851c-1f871aed06f2","Type":"ContainerStarted","Data":"144f26c9523452486d6ff8f604911d2109e4110b76990bd7f16917a384e098ca"} Oct 01 10:20:18 crc kubenswrapper[4735]: I1001 10:20:18.350837 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5z55n" podStartSLOduration=3.268214196 podStartE2EDuration="48.350814473s" podCreationTimestamp="2025-10-01 10:19:30 +0000 UTC" firstStartedPulling="2025-10-01 10:19:33.024133703 +0000 UTC m=+131.716954965" lastFinishedPulling="2025-10-01 10:20:18.10673397 +0000 UTC m=+176.799555242" 
observedRunningTime="2025-10-01 10:20:18.346094679 +0000 UTC m=+177.038915971" watchObservedRunningTime="2025-10-01 10:20:18.350814473 +0000 UTC m=+177.043635765" Oct 01 10:20:18 crc kubenswrapper[4735]: I1001 10:20:18.375189 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kq8q9" Oct 01 10:20:19 crc kubenswrapper[4735]: I1001 10:20:19.194940 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kq8q9"] Oct 01 10:20:20 crc kubenswrapper[4735]: I1001 10:20:20.340576 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kq8q9" podUID="3adb5d44-1cc9-4955-8ca4-a27b4e561251" containerName="registry-server" containerID="cri-o://5f15b7d095bd5e26977802bfa8b49368ffc073c6683e2056ea05e53a1a752a3f" gracePeriod=2 Oct 01 10:20:20 crc kubenswrapper[4735]: I1001 10:20:20.341647 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k844p" Oct 01 10:20:20 crc kubenswrapper[4735]: I1001 10:20:20.341718 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k844p" Oct 01 10:20:20 crc kubenswrapper[4735]: I1001 10:20:20.388995 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k844p" Oct 01 10:20:20 crc kubenswrapper[4735]: I1001 10:20:20.681960 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kq8q9" Oct 01 10:20:20 crc kubenswrapper[4735]: I1001 10:20:20.760787 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq245\" (UniqueName: \"kubernetes.io/projected/3adb5d44-1cc9-4955-8ca4-a27b4e561251-kube-api-access-lq245\") pod \"3adb5d44-1cc9-4955-8ca4-a27b4e561251\" (UID: \"3adb5d44-1cc9-4955-8ca4-a27b4e561251\") " Oct 01 10:20:20 crc kubenswrapper[4735]: I1001 10:20:20.761513 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3adb5d44-1cc9-4955-8ca4-a27b4e561251-catalog-content\") pod \"3adb5d44-1cc9-4955-8ca4-a27b4e561251\" (UID: \"3adb5d44-1cc9-4955-8ca4-a27b4e561251\") " Oct 01 10:20:20 crc kubenswrapper[4735]: I1001 10:20:20.761604 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3adb5d44-1cc9-4955-8ca4-a27b4e561251-utilities\") pod \"3adb5d44-1cc9-4955-8ca4-a27b4e561251\" (UID: \"3adb5d44-1cc9-4955-8ca4-a27b4e561251\") " Oct 01 10:20:20 crc kubenswrapper[4735]: I1001 10:20:20.762670 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3adb5d44-1cc9-4955-8ca4-a27b4e561251-utilities" (OuterVolumeSpecName: "utilities") pod "3adb5d44-1cc9-4955-8ca4-a27b4e561251" (UID: "3adb5d44-1cc9-4955-8ca4-a27b4e561251"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:20:20 crc kubenswrapper[4735]: I1001 10:20:20.767993 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3adb5d44-1cc9-4955-8ca4-a27b4e561251-kube-api-access-lq245" (OuterVolumeSpecName: "kube-api-access-lq245") pod "3adb5d44-1cc9-4955-8ca4-a27b4e561251" (UID: "3adb5d44-1cc9-4955-8ca4-a27b4e561251"). InnerVolumeSpecName "kube-api-access-lq245". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:20:20 crc kubenswrapper[4735]: I1001 10:20:20.809368 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3adb5d44-1cc9-4955-8ca4-a27b4e561251-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3adb5d44-1cc9-4955-8ca4-a27b4e561251" (UID: "3adb5d44-1cc9-4955-8ca4-a27b4e561251"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:20:20 crc kubenswrapper[4735]: I1001 10:20:20.863641 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq245\" (UniqueName: \"kubernetes.io/projected/3adb5d44-1cc9-4955-8ca4-a27b4e561251-kube-api-access-lq245\") on node \"crc\" DevicePath \"\"" Oct 01 10:20:20 crc kubenswrapper[4735]: I1001 10:20:20.863700 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3adb5d44-1cc9-4955-8ca4-a27b4e561251-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 10:20:20 crc kubenswrapper[4735]: I1001 10:20:20.863712 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3adb5d44-1cc9-4955-8ca4-a27b4e561251-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 10:20:20 crc kubenswrapper[4735]: I1001 10:20:20.924476 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6nvtz" Oct 01 10:20:20 crc kubenswrapper[4735]: I1001 10:20:20.924553 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6nvtz" Oct 01 10:20:21 crc kubenswrapper[4735]: I1001 10:20:21.320432 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5z55n" Oct 01 10:20:21 crc kubenswrapper[4735]: I1001 10:20:21.320529 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-operators-5z55n" Oct 01 10:20:21 crc kubenswrapper[4735]: I1001 10:20:21.360848 4735 generic.go:334] "Generic (PLEG): container finished" podID="3adb5d44-1cc9-4955-8ca4-a27b4e561251" containerID="5f15b7d095bd5e26977802bfa8b49368ffc073c6683e2056ea05e53a1a752a3f" exitCode=0 Oct 01 10:20:21 crc kubenswrapper[4735]: I1001 10:20:21.361534 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kq8q9" Oct 01 10:20:21 crc kubenswrapper[4735]: I1001 10:20:21.361700 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kq8q9" event={"ID":"3adb5d44-1cc9-4955-8ca4-a27b4e561251","Type":"ContainerDied","Data":"5f15b7d095bd5e26977802bfa8b49368ffc073c6683e2056ea05e53a1a752a3f"} Oct 01 10:20:21 crc kubenswrapper[4735]: I1001 10:20:21.361735 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kq8q9" event={"ID":"3adb5d44-1cc9-4955-8ca4-a27b4e561251","Type":"ContainerDied","Data":"0ce88546cea0ce1839f623c0f211aaacb56d602300b0dd7929991ab561a8edd3"} Oct 01 10:20:21 crc kubenswrapper[4735]: I1001 10:20:21.361756 4735 scope.go:117] "RemoveContainer" containerID="5f15b7d095bd5e26977802bfa8b49368ffc073c6683e2056ea05e53a1a752a3f" Oct 01 10:20:21 crc kubenswrapper[4735]: I1001 10:20:21.380468 4735 scope.go:117] "RemoveContainer" containerID="d32ed0fd4818428beb73b5cc5b04c6ccba2fd7636ce929fbc22db9386b8a1936" Oct 01 10:20:21 crc kubenswrapper[4735]: I1001 10:20:21.395015 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kq8q9"] Oct 01 10:20:21 crc kubenswrapper[4735]: I1001 10:20:21.398323 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kq8q9"] Oct 01 10:20:21 crc kubenswrapper[4735]: I1001 10:20:21.407903 4735 scope.go:117] "RemoveContainer" 
containerID="dfcfd4ccf161be8a6da28520192e7c12b9c8bdf77dc7f95e830dc32b0ef52ce3" Oct 01 10:20:21 crc kubenswrapper[4735]: I1001 10:20:21.420237 4735 scope.go:117] "RemoveContainer" containerID="5f15b7d095bd5e26977802bfa8b49368ffc073c6683e2056ea05e53a1a752a3f" Oct 01 10:20:21 crc kubenswrapper[4735]: E1001 10:20:21.420767 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f15b7d095bd5e26977802bfa8b49368ffc073c6683e2056ea05e53a1a752a3f\": container with ID starting with 5f15b7d095bd5e26977802bfa8b49368ffc073c6683e2056ea05e53a1a752a3f not found: ID does not exist" containerID="5f15b7d095bd5e26977802bfa8b49368ffc073c6683e2056ea05e53a1a752a3f" Oct 01 10:20:21 crc kubenswrapper[4735]: I1001 10:20:21.420812 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f15b7d095bd5e26977802bfa8b49368ffc073c6683e2056ea05e53a1a752a3f"} err="failed to get container status \"5f15b7d095bd5e26977802bfa8b49368ffc073c6683e2056ea05e53a1a752a3f\": rpc error: code = NotFound desc = could not find container \"5f15b7d095bd5e26977802bfa8b49368ffc073c6683e2056ea05e53a1a752a3f\": container with ID starting with 5f15b7d095bd5e26977802bfa8b49368ffc073c6683e2056ea05e53a1a752a3f not found: ID does not exist" Oct 01 10:20:21 crc kubenswrapper[4735]: I1001 10:20:21.420840 4735 scope.go:117] "RemoveContainer" containerID="d32ed0fd4818428beb73b5cc5b04c6ccba2fd7636ce929fbc22db9386b8a1936" Oct 01 10:20:21 crc kubenswrapper[4735]: E1001 10:20:21.421201 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d32ed0fd4818428beb73b5cc5b04c6ccba2fd7636ce929fbc22db9386b8a1936\": container with ID starting with d32ed0fd4818428beb73b5cc5b04c6ccba2fd7636ce929fbc22db9386b8a1936 not found: ID does not exist" containerID="d32ed0fd4818428beb73b5cc5b04c6ccba2fd7636ce929fbc22db9386b8a1936" Oct 01 10:20:21 crc 
kubenswrapper[4735]: I1001 10:20:21.421228 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d32ed0fd4818428beb73b5cc5b04c6ccba2fd7636ce929fbc22db9386b8a1936"} err="failed to get container status \"d32ed0fd4818428beb73b5cc5b04c6ccba2fd7636ce929fbc22db9386b8a1936\": rpc error: code = NotFound desc = could not find container \"d32ed0fd4818428beb73b5cc5b04c6ccba2fd7636ce929fbc22db9386b8a1936\": container with ID starting with d32ed0fd4818428beb73b5cc5b04c6ccba2fd7636ce929fbc22db9386b8a1936 not found: ID does not exist" Oct 01 10:20:21 crc kubenswrapper[4735]: I1001 10:20:21.421248 4735 scope.go:117] "RemoveContainer" containerID="dfcfd4ccf161be8a6da28520192e7c12b9c8bdf77dc7f95e830dc32b0ef52ce3" Oct 01 10:20:21 crc kubenswrapper[4735]: E1001 10:20:21.421513 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfcfd4ccf161be8a6da28520192e7c12b9c8bdf77dc7f95e830dc32b0ef52ce3\": container with ID starting with dfcfd4ccf161be8a6da28520192e7c12b9c8bdf77dc7f95e830dc32b0ef52ce3 not found: ID does not exist" containerID="dfcfd4ccf161be8a6da28520192e7c12b9c8bdf77dc7f95e830dc32b0ef52ce3" Oct 01 10:20:21 crc kubenswrapper[4735]: I1001 10:20:21.421614 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfcfd4ccf161be8a6da28520192e7c12b9c8bdf77dc7f95e830dc32b0ef52ce3"} err="failed to get container status \"dfcfd4ccf161be8a6da28520192e7c12b9c8bdf77dc7f95e830dc32b0ef52ce3\": rpc error: code = NotFound desc = could not find container \"dfcfd4ccf161be8a6da28520192e7c12b9c8bdf77dc7f95e830dc32b0ef52ce3\": container with ID starting with dfcfd4ccf161be8a6da28520192e7c12b9c8bdf77dc7f95e830dc32b0ef52ce3 not found: ID does not exist" Oct 01 10:20:21 crc kubenswrapper[4735]: I1001 10:20:21.903629 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3adb5d44-1cc9-4955-8ca4-a27b4e561251" 
path="/var/lib/kubelet/pods/3adb5d44-1cc9-4955-8ca4-a27b4e561251/volumes" Oct 01 10:20:21 crc kubenswrapper[4735]: I1001 10:20:21.972548 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6nvtz" podUID="e3285eed-d809-4408-af84-a2020d07c59c" containerName="registry-server" probeResult="failure" output=< Oct 01 10:20:21 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Oct 01 10:20:21 crc kubenswrapper[4735]: > Oct 01 10:20:22 crc kubenswrapper[4735]: I1001 10:20:22.366002 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5z55n" podUID="a7191c19-9c62-4c02-851c-1f871aed06f2" containerName="registry-server" probeResult="failure" output=< Oct 01 10:20:22 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Oct 01 10:20:22 crc kubenswrapper[4735]: > Oct 01 10:20:27 crc kubenswrapper[4735]: I1001 10:20:27.762269 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x7j8p" Oct 01 10:20:30 crc kubenswrapper[4735]: I1001 10:20:30.379487 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k844p" Oct 01 10:20:30 crc kubenswrapper[4735]: I1001 10:20:30.434021 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k844p"] Oct 01 10:20:30 crc kubenswrapper[4735]: I1001 10:20:30.434218 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k844p" podUID="0c167fd0-1242-4be7-8261-391836d921ae" containerName="registry-server" containerID="cri-o://7f9c04079b076e823f5375dd0381b1e04b12920bcd9b83979430283d5c0c8e31" gracePeriod=2 Oct 01 10:20:30 crc kubenswrapper[4735]: I1001 10:20:30.748901 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k844p" Oct 01 10:20:30 crc kubenswrapper[4735]: I1001 10:20:30.799579 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c167fd0-1242-4be7-8261-391836d921ae-utilities\") pod \"0c167fd0-1242-4be7-8261-391836d921ae\" (UID: \"0c167fd0-1242-4be7-8261-391836d921ae\") " Oct 01 10:20:30 crc kubenswrapper[4735]: I1001 10:20:30.799636 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c167fd0-1242-4be7-8261-391836d921ae-catalog-content\") pod \"0c167fd0-1242-4be7-8261-391836d921ae\" (UID: \"0c167fd0-1242-4be7-8261-391836d921ae\") " Oct 01 10:20:30 crc kubenswrapper[4735]: I1001 10:20:30.799698 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44sbb\" (UniqueName: \"kubernetes.io/projected/0c167fd0-1242-4be7-8261-391836d921ae-kube-api-access-44sbb\") pod \"0c167fd0-1242-4be7-8261-391836d921ae\" (UID: \"0c167fd0-1242-4be7-8261-391836d921ae\") " Oct 01 10:20:30 crc kubenswrapper[4735]: I1001 10:20:30.800444 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c167fd0-1242-4be7-8261-391836d921ae-utilities" (OuterVolumeSpecName: "utilities") pod "0c167fd0-1242-4be7-8261-391836d921ae" (UID: "0c167fd0-1242-4be7-8261-391836d921ae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:20:30 crc kubenswrapper[4735]: I1001 10:20:30.806659 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c167fd0-1242-4be7-8261-391836d921ae-kube-api-access-44sbb" (OuterVolumeSpecName: "kube-api-access-44sbb") pod "0c167fd0-1242-4be7-8261-391836d921ae" (UID: "0c167fd0-1242-4be7-8261-391836d921ae"). InnerVolumeSpecName "kube-api-access-44sbb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:20:30 crc kubenswrapper[4735]: I1001 10:20:30.813685 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c167fd0-1242-4be7-8261-391836d921ae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c167fd0-1242-4be7-8261-391836d921ae" (UID: "0c167fd0-1242-4be7-8261-391836d921ae"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:20:30 crc kubenswrapper[4735]: I1001 10:20:30.901632 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c167fd0-1242-4be7-8261-391836d921ae-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 10:20:30 crc kubenswrapper[4735]: I1001 10:20:30.901664 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c167fd0-1242-4be7-8261-391836d921ae-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 10:20:30 crc kubenswrapper[4735]: I1001 10:20:30.901677 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44sbb\" (UniqueName: \"kubernetes.io/projected/0c167fd0-1242-4be7-8261-391836d921ae-kube-api-access-44sbb\") on node \"crc\" DevicePath \"\"" Oct 01 10:20:30 crc kubenswrapper[4735]: I1001 10:20:30.962622 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6nvtz" Oct 01 10:20:31 crc kubenswrapper[4735]: I1001 10:20:31.002703 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6nvtz" Oct 01 10:20:31 crc kubenswrapper[4735]: I1001 10:20:31.357449 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5z55n" Oct 01 10:20:31 crc kubenswrapper[4735]: I1001 10:20:31.394890 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-operators-5z55n" Oct 01 10:20:31 crc kubenswrapper[4735]: I1001 10:20:31.419015 4735 generic.go:334] "Generic (PLEG): container finished" podID="0c167fd0-1242-4be7-8261-391836d921ae" containerID="7f9c04079b076e823f5375dd0381b1e04b12920bcd9b83979430283d5c0c8e31" exitCode=0 Oct 01 10:20:31 crc kubenswrapper[4735]: I1001 10:20:31.419077 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k844p" Oct 01 10:20:31 crc kubenswrapper[4735]: I1001 10:20:31.419151 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k844p" event={"ID":"0c167fd0-1242-4be7-8261-391836d921ae","Type":"ContainerDied","Data":"7f9c04079b076e823f5375dd0381b1e04b12920bcd9b83979430283d5c0c8e31"} Oct 01 10:20:31 crc kubenswrapper[4735]: I1001 10:20:31.419183 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k844p" event={"ID":"0c167fd0-1242-4be7-8261-391836d921ae","Type":"ContainerDied","Data":"32db815162b96c231fcf92135f41c789676f03e56280c42a395a7424dd1378e9"} Oct 01 10:20:31 crc kubenswrapper[4735]: I1001 10:20:31.419210 4735 scope.go:117] "RemoveContainer" containerID="7f9c04079b076e823f5375dd0381b1e04b12920bcd9b83979430283d5c0c8e31" Oct 01 10:20:31 crc kubenswrapper[4735]: I1001 10:20:31.442588 4735 scope.go:117] "RemoveContainer" containerID="629a7cd1ea5997552157d436f7ecb6619da9005088d4b9a3011a3d1e374e215e" Oct 01 10:20:31 crc kubenswrapper[4735]: I1001 10:20:31.445210 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k844p"] Oct 01 10:20:31 crc kubenswrapper[4735]: I1001 10:20:31.448742 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k844p"] Oct 01 10:20:31 crc kubenswrapper[4735]: I1001 10:20:31.467577 4735 scope.go:117] "RemoveContainer" 
containerID="829a34e3557f2f5a309628835ac9571058879b0f2dcfa81d77d4634fc42b3098" Oct 01 10:20:31 crc kubenswrapper[4735]: I1001 10:20:31.483041 4735 scope.go:117] "RemoveContainer" containerID="7f9c04079b076e823f5375dd0381b1e04b12920bcd9b83979430283d5c0c8e31" Oct 01 10:20:31 crc kubenswrapper[4735]: E1001 10:20:31.483475 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f9c04079b076e823f5375dd0381b1e04b12920bcd9b83979430283d5c0c8e31\": container with ID starting with 7f9c04079b076e823f5375dd0381b1e04b12920bcd9b83979430283d5c0c8e31 not found: ID does not exist" containerID="7f9c04079b076e823f5375dd0381b1e04b12920bcd9b83979430283d5c0c8e31" Oct 01 10:20:31 crc kubenswrapper[4735]: I1001 10:20:31.483554 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f9c04079b076e823f5375dd0381b1e04b12920bcd9b83979430283d5c0c8e31"} err="failed to get container status \"7f9c04079b076e823f5375dd0381b1e04b12920bcd9b83979430283d5c0c8e31\": rpc error: code = NotFound desc = could not find container \"7f9c04079b076e823f5375dd0381b1e04b12920bcd9b83979430283d5c0c8e31\": container with ID starting with 7f9c04079b076e823f5375dd0381b1e04b12920bcd9b83979430283d5c0c8e31 not found: ID does not exist" Oct 01 10:20:31 crc kubenswrapper[4735]: I1001 10:20:31.483579 4735 scope.go:117] "RemoveContainer" containerID="629a7cd1ea5997552157d436f7ecb6619da9005088d4b9a3011a3d1e374e215e" Oct 01 10:20:31 crc kubenswrapper[4735]: E1001 10:20:31.483892 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"629a7cd1ea5997552157d436f7ecb6619da9005088d4b9a3011a3d1e374e215e\": container with ID starting with 629a7cd1ea5997552157d436f7ecb6619da9005088d4b9a3011a3d1e374e215e not found: ID does not exist" containerID="629a7cd1ea5997552157d436f7ecb6619da9005088d4b9a3011a3d1e374e215e" Oct 01 10:20:31 crc 
kubenswrapper[4735]: I1001 10:20:31.483929 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"629a7cd1ea5997552157d436f7ecb6619da9005088d4b9a3011a3d1e374e215e"} err="failed to get container status \"629a7cd1ea5997552157d436f7ecb6619da9005088d4b9a3011a3d1e374e215e\": rpc error: code = NotFound desc = could not find container \"629a7cd1ea5997552157d436f7ecb6619da9005088d4b9a3011a3d1e374e215e\": container with ID starting with 629a7cd1ea5997552157d436f7ecb6619da9005088d4b9a3011a3d1e374e215e not found: ID does not exist" Oct 01 10:20:31 crc kubenswrapper[4735]: I1001 10:20:31.483947 4735 scope.go:117] "RemoveContainer" containerID="829a34e3557f2f5a309628835ac9571058879b0f2dcfa81d77d4634fc42b3098" Oct 01 10:20:31 crc kubenswrapper[4735]: E1001 10:20:31.484256 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"829a34e3557f2f5a309628835ac9571058879b0f2dcfa81d77d4634fc42b3098\": container with ID starting with 829a34e3557f2f5a309628835ac9571058879b0f2dcfa81d77d4634fc42b3098 not found: ID does not exist" containerID="829a34e3557f2f5a309628835ac9571058879b0f2dcfa81d77d4634fc42b3098" Oct 01 10:20:31 crc kubenswrapper[4735]: I1001 10:20:31.484307 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"829a34e3557f2f5a309628835ac9571058879b0f2dcfa81d77d4634fc42b3098"} err="failed to get container status \"829a34e3557f2f5a309628835ac9571058879b0f2dcfa81d77d4634fc42b3098\": rpc error: code = NotFound desc = could not find container \"829a34e3557f2f5a309628835ac9571058879b0f2dcfa81d77d4634fc42b3098\": container with ID starting with 829a34e3557f2f5a309628835ac9571058879b0f2dcfa81d77d4634fc42b3098 not found: ID does not exist" Oct 01 10:20:31 crc kubenswrapper[4735]: I1001 10:20:31.902523 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c167fd0-1242-4be7-8261-391836d921ae" 
path="/var/lib/kubelet/pods/0c167fd0-1242-4be7-8261-391836d921ae/volumes" Oct 01 10:20:33 crc kubenswrapper[4735]: I1001 10:20:33.193062 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5z55n"] Oct 01 10:20:33 crc kubenswrapper[4735]: I1001 10:20:33.193608 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5z55n" podUID="a7191c19-9c62-4c02-851c-1f871aed06f2" containerName="registry-server" containerID="cri-o://144f26c9523452486d6ff8f604911d2109e4110b76990bd7f16917a384e098ca" gracePeriod=2 Oct 01 10:20:33 crc kubenswrapper[4735]: I1001 10:20:33.432406 4735 generic.go:334] "Generic (PLEG): container finished" podID="a7191c19-9c62-4c02-851c-1f871aed06f2" containerID="144f26c9523452486d6ff8f604911d2109e4110b76990bd7f16917a384e098ca" exitCode=0 Oct 01 10:20:33 crc kubenswrapper[4735]: I1001 10:20:33.432448 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5z55n" event={"ID":"a7191c19-9c62-4c02-851c-1f871aed06f2","Type":"ContainerDied","Data":"144f26c9523452486d6ff8f604911d2109e4110b76990bd7f16917a384e098ca"} Oct 01 10:20:33 crc kubenswrapper[4735]: I1001 10:20:33.640976 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5z55n" Oct 01 10:20:33 crc kubenswrapper[4735]: I1001 10:20:33.735069 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtr5n\" (UniqueName: \"kubernetes.io/projected/a7191c19-9c62-4c02-851c-1f871aed06f2-kube-api-access-jtr5n\") pod \"a7191c19-9c62-4c02-851c-1f871aed06f2\" (UID: \"a7191c19-9c62-4c02-851c-1f871aed06f2\") " Oct 01 10:20:33 crc kubenswrapper[4735]: I1001 10:20:33.735171 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7191c19-9c62-4c02-851c-1f871aed06f2-utilities\") pod \"a7191c19-9c62-4c02-851c-1f871aed06f2\" (UID: \"a7191c19-9c62-4c02-851c-1f871aed06f2\") " Oct 01 10:20:33 crc kubenswrapper[4735]: I1001 10:20:33.735259 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7191c19-9c62-4c02-851c-1f871aed06f2-catalog-content\") pod \"a7191c19-9c62-4c02-851c-1f871aed06f2\" (UID: \"a7191c19-9c62-4c02-851c-1f871aed06f2\") " Oct 01 10:20:33 crc kubenswrapper[4735]: I1001 10:20:33.736367 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7191c19-9c62-4c02-851c-1f871aed06f2-utilities" (OuterVolumeSpecName: "utilities") pod "a7191c19-9c62-4c02-851c-1f871aed06f2" (UID: "a7191c19-9c62-4c02-851c-1f871aed06f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:20:33 crc kubenswrapper[4735]: I1001 10:20:33.740031 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7191c19-9c62-4c02-851c-1f871aed06f2-kube-api-access-jtr5n" (OuterVolumeSpecName: "kube-api-access-jtr5n") pod "a7191c19-9c62-4c02-851c-1f871aed06f2" (UID: "a7191c19-9c62-4c02-851c-1f871aed06f2"). InnerVolumeSpecName "kube-api-access-jtr5n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:20:33 crc kubenswrapper[4735]: I1001 10:20:33.818739 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7191c19-9c62-4c02-851c-1f871aed06f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a7191c19-9c62-4c02-851c-1f871aed06f2" (UID: "a7191c19-9c62-4c02-851c-1f871aed06f2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:20:33 crc kubenswrapper[4735]: I1001 10:20:33.837015 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7191c19-9c62-4c02-851c-1f871aed06f2-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 10:20:33 crc kubenswrapper[4735]: I1001 10:20:33.837049 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7191c19-9c62-4c02-851c-1f871aed06f2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 10:20:33 crc kubenswrapper[4735]: I1001 10:20:33.837060 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtr5n\" (UniqueName: \"kubernetes.io/projected/a7191c19-9c62-4c02-851c-1f871aed06f2-kube-api-access-jtr5n\") on node \"crc\" DevicePath \"\"" Oct 01 10:20:34 crc kubenswrapper[4735]: I1001 10:20:34.439652 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5z55n" event={"ID":"a7191c19-9c62-4c02-851c-1f871aed06f2","Type":"ContainerDied","Data":"2b6348e36a0c789c5650b286baeb2be09925098b49d6c3298233b9c3055b316d"} Oct 01 10:20:34 crc kubenswrapper[4735]: I1001 10:20:34.439705 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5z55n" Oct 01 10:20:34 crc kubenswrapper[4735]: I1001 10:20:34.439717 4735 scope.go:117] "RemoveContainer" containerID="144f26c9523452486d6ff8f604911d2109e4110b76990bd7f16917a384e098ca" Oct 01 10:20:34 crc kubenswrapper[4735]: I1001 10:20:34.452815 4735 scope.go:117] "RemoveContainer" containerID="0a03ae71be4d2da036bf63287a825a7f5bfb14dc6971443b24f782322f665a86" Oct 01 10:20:34 crc kubenswrapper[4735]: I1001 10:20:34.458321 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5z55n"] Oct 01 10:20:34 crc kubenswrapper[4735]: I1001 10:20:34.462192 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5z55n"] Oct 01 10:20:34 crc kubenswrapper[4735]: I1001 10:20:34.467749 4735 scope.go:117] "RemoveContainer" containerID="828640d5c654ee463a83e8c556eaf621d0ea7740ce62bb9adf61025040e32bad" Oct 01 10:20:34 crc kubenswrapper[4735]: I1001 10:20:34.808027 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-rs7ln"] Oct 01 10:20:35 crc kubenswrapper[4735]: I1001 10:20:35.485453 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 10:20:35 crc kubenswrapper[4735]: I1001 10:20:35.485820 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 10:20:35 crc kubenswrapper[4735]: I1001 10:20:35.902521 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="a7191c19-9c62-4c02-851c-1f871aed06f2" path="/var/lib/kubelet/pods/a7191c19-9c62-4c02-851c-1f871aed06f2/volumes" Oct 01 10:20:39 crc kubenswrapper[4735]: I1001 10:20:39.228890 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 10:20:59 crc kubenswrapper[4735]: I1001 10:20:59.834781 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" podUID="6718bff7-4d68-4aa2-ad2b-1511e0799683" containerName="oauth-openshift" containerID="cri-o://7113c55e7631ee46bd97a454731e1cd9f8656953a1e03ee0c1649f71a146907b" gracePeriod=15 Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.149696 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.180817 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7494c98dcc-97jnj"] Oct 01 10:21:00 crc kubenswrapper[4735]: E1001 10:21:00.181016 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6718bff7-4d68-4aa2-ad2b-1511e0799683" containerName="oauth-openshift" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.181027 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6718bff7-4d68-4aa2-ad2b-1511e0799683" containerName="oauth-openshift" Oct 01 10:21:00 crc kubenswrapper[4735]: E1001 10:21:00.181039 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7191c19-9c62-4c02-851c-1f871aed06f2" containerName="extract-utilities" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.181045 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7191c19-9c62-4c02-851c-1f871aed06f2" containerName="extract-utilities" Oct 01 10:21:00 crc kubenswrapper[4735]: E1001 10:21:00.181054 4735 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="3adb5d44-1cc9-4955-8ca4-a27b4e561251" containerName="extract-utilities" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.181059 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3adb5d44-1cc9-4955-8ca4-a27b4e561251" containerName="extract-utilities" Oct 01 10:21:00 crc kubenswrapper[4735]: E1001 10:21:00.181080 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5608a54e-a42f-4e14-aa3e-a5503c7a4dc4" containerName="registry-server" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.181086 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5608a54e-a42f-4e14-aa3e-a5503c7a4dc4" containerName="registry-server" Oct 01 10:21:00 crc kubenswrapper[4735]: E1001 10:21:00.181097 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5608a54e-a42f-4e14-aa3e-a5503c7a4dc4" containerName="extract-utilities" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.181102 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5608a54e-a42f-4e14-aa3e-a5503c7a4dc4" containerName="extract-utilities" Oct 01 10:21:00 crc kubenswrapper[4735]: E1001 10:21:00.181109 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c167fd0-1242-4be7-8261-391836d921ae" containerName="extract-content" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.181114 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c167fd0-1242-4be7-8261-391836d921ae" containerName="extract-content" Oct 01 10:21:00 crc kubenswrapper[4735]: E1001 10:21:00.181123 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3adb5d44-1cc9-4955-8ca4-a27b4e561251" containerName="extract-content" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.181128 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3adb5d44-1cc9-4955-8ca4-a27b4e561251" containerName="extract-content" Oct 01 10:21:00 crc kubenswrapper[4735]: E1001 10:21:00.181137 4735 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="0c167fd0-1242-4be7-8261-391836d921ae" containerName="extract-utilities" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.181142 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c167fd0-1242-4be7-8261-391836d921ae" containerName="extract-utilities" Oct 01 10:21:00 crc kubenswrapper[4735]: E1001 10:21:00.181150 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5608a54e-a42f-4e14-aa3e-a5503c7a4dc4" containerName="extract-content" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.181155 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5608a54e-a42f-4e14-aa3e-a5503c7a4dc4" containerName="extract-content" Oct 01 10:21:00 crc kubenswrapper[4735]: E1001 10:21:00.181163 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7191c19-9c62-4c02-851c-1f871aed06f2" containerName="registry-server" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.181168 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7191c19-9c62-4c02-851c-1f871aed06f2" containerName="registry-server" Oct 01 10:21:00 crc kubenswrapper[4735]: E1001 10:21:00.181178 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c167fd0-1242-4be7-8261-391836d921ae" containerName="registry-server" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.181183 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c167fd0-1242-4be7-8261-391836d921ae" containerName="registry-server" Oct 01 10:21:00 crc kubenswrapper[4735]: E1001 10:21:00.181191 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7191c19-9c62-4c02-851c-1f871aed06f2" containerName="extract-content" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.181196 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7191c19-9c62-4c02-851c-1f871aed06f2" containerName="extract-content" Oct 01 10:21:00 crc kubenswrapper[4735]: E1001 10:21:00.181204 4735 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c4ff8717-b8b7-445a-a152-4c1aab50702d" containerName="pruner" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.181210 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ff8717-b8b7-445a-a152-4c1aab50702d" containerName="pruner" Oct 01 10:21:00 crc kubenswrapper[4735]: E1001 10:21:00.181217 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3adb5d44-1cc9-4955-8ca4-a27b4e561251" containerName="registry-server" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.181223 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3adb5d44-1cc9-4955-8ca4-a27b4e561251" containerName="registry-server" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.181309 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5608a54e-a42f-4e14-aa3e-a5503c7a4dc4" containerName="registry-server" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.181318 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="6718bff7-4d68-4aa2-ad2b-1511e0799683" containerName="oauth-openshift" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.181325 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c167fd0-1242-4be7-8261-391836d921ae" containerName="registry-server" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.181334 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4ff8717-b8b7-445a-a152-4c1aab50702d" containerName="pruner" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.181341 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="3adb5d44-1cc9-4955-8ca4-a27b4e561251" containerName="registry-server" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.181349 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7191c19-9c62-4c02-851c-1f871aed06f2" containerName="registry-server" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.181740 4735 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.191037 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7494c98dcc-97jnj"] Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.248542 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-user-template-error\") pod \"6718bff7-4d68-4aa2-ad2b-1511e0799683\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.248652 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-trusted-ca-bundle\") pod \"6718bff7-4d68-4aa2-ad2b-1511e0799683\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.248715 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-service-ca\") pod \"6718bff7-4d68-4aa2-ad2b-1511e0799683\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.248790 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-router-certs\") pod \"6718bff7-4d68-4aa2-ad2b-1511e0799683\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.248834 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-cliconfig\") pod \"6718bff7-4d68-4aa2-ad2b-1511e0799683\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.248894 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpjrb\" (UniqueName: \"kubernetes.io/projected/6718bff7-4d68-4aa2-ad2b-1511e0799683-kube-api-access-bpjrb\") pod \"6718bff7-4d68-4aa2-ad2b-1511e0799683\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.249593 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-user-template-provider-selection\") pod \"6718bff7-4d68-4aa2-ad2b-1511e0799683\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.249699 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-user-template-login\") pod \"6718bff7-4d68-4aa2-ad2b-1511e0799683\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.249683 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "6718bff7-4d68-4aa2-ad2b-1511e0799683" (UID: "6718bff7-4d68-4aa2-ad2b-1511e0799683"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.249697 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "6718bff7-4d68-4aa2-ad2b-1511e0799683" (UID: "6718bff7-4d68-4aa2-ad2b-1511e0799683"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.249712 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "6718bff7-4d68-4aa2-ad2b-1511e0799683" (UID: "6718bff7-4d68-4aa2-ad2b-1511e0799683"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.249758 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-serving-cert\") pod \"6718bff7-4d68-4aa2-ad2b-1511e0799683\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.249892 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-ocp-branding-template\") pod \"6718bff7-4d68-4aa2-ad2b-1511e0799683\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.249997 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-session\") pod \"6718bff7-4d68-4aa2-ad2b-1511e0799683\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.250031 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6718bff7-4d68-4aa2-ad2b-1511e0799683-audit-policies\") pod \"6718bff7-4d68-4aa2-ad2b-1511e0799683\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.250080 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-user-idp-0-file-data\") pod \"6718bff7-4d68-4aa2-ad2b-1511e0799683\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.250114 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6718bff7-4d68-4aa2-ad2b-1511e0799683-audit-dir\") pod \"6718bff7-4d68-4aa2-ad2b-1511e0799683\" (UID: \"6718bff7-4d68-4aa2-ad2b-1511e0799683\") " Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.250372 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a21f0a66-5049-4223-b9b1-2cfa43ac9854-v4-0-config-user-template-login\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.250446 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/a21f0a66-5049-4223-b9b1-2cfa43ac9854-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.250520 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a21f0a66-5049-4223-b9b1-2cfa43ac9854-v4-0-config-system-router-certs\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.250559 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a21f0a66-5049-4223-b9b1-2cfa43ac9854-v4-0-config-system-session\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.250596 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a21f0a66-5049-4223-b9b1-2cfa43ac9854-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.250678 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a21f0a66-5049-4223-b9b1-2cfa43ac9854-v4-0-config-user-template-provider-selection\") pod 
\"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.250766 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a21f0a66-5049-4223-b9b1-2cfa43ac9854-v4-0-config-system-service-ca\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.250789 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a21f0a66-5049-4223-b9b1-2cfa43ac9854-v4-0-config-user-template-error\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.250854 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a21f0a66-5049-4223-b9b1-2cfa43ac9854-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.250879 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a21f0a66-5049-4223-b9b1-2cfa43ac9854-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " 
pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.250911 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a21f0a66-5049-4223-b9b1-2cfa43ac9854-audit-dir\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.250935 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a21f0a66-5049-4223-b9b1-2cfa43ac9854-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.250985 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a21f0a66-5049-4223-b9b1-2cfa43ac9854-audit-policies\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.251015 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l874p\" (UniqueName: \"kubernetes.io/projected/a21f0a66-5049-4223-b9b1-2cfa43ac9854-kube-api-access-l874p\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.251078 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.251091 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.251103 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.251970 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6718bff7-4d68-4aa2-ad2b-1511e0799683-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "6718bff7-4d68-4aa2-ad2b-1511e0799683" (UID: "6718bff7-4d68-4aa2-ad2b-1511e0799683"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.252695 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6718bff7-4d68-4aa2-ad2b-1511e0799683-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "6718bff7-4d68-4aa2-ad2b-1511e0799683" (UID: "6718bff7-4d68-4aa2-ad2b-1511e0799683"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.255975 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "6718bff7-4d68-4aa2-ad2b-1511e0799683" (UID: "6718bff7-4d68-4aa2-ad2b-1511e0799683"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.256378 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "6718bff7-4d68-4aa2-ad2b-1511e0799683" (UID: "6718bff7-4d68-4aa2-ad2b-1511e0799683"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.256746 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "6718bff7-4d68-4aa2-ad2b-1511e0799683" (UID: "6718bff7-4d68-4aa2-ad2b-1511e0799683"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.257168 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "6718bff7-4d68-4aa2-ad2b-1511e0799683" (UID: "6718bff7-4d68-4aa2-ad2b-1511e0799683"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.257260 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6718bff7-4d68-4aa2-ad2b-1511e0799683-kube-api-access-bpjrb" (OuterVolumeSpecName: "kube-api-access-bpjrb") pod "6718bff7-4d68-4aa2-ad2b-1511e0799683" (UID: "6718bff7-4d68-4aa2-ad2b-1511e0799683"). InnerVolumeSpecName "kube-api-access-bpjrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.257396 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "6718bff7-4d68-4aa2-ad2b-1511e0799683" (UID: "6718bff7-4d68-4aa2-ad2b-1511e0799683"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.257626 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "6718bff7-4d68-4aa2-ad2b-1511e0799683" (UID: "6718bff7-4d68-4aa2-ad2b-1511e0799683"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.257844 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "6718bff7-4d68-4aa2-ad2b-1511e0799683" (UID: "6718bff7-4d68-4aa2-ad2b-1511e0799683"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.258098 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "6718bff7-4d68-4aa2-ad2b-1511e0799683" (UID: "6718bff7-4d68-4aa2-ad2b-1511e0799683"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.352164 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l874p\" (UniqueName: \"kubernetes.io/projected/a21f0a66-5049-4223-b9b1-2cfa43ac9854-kube-api-access-l874p\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.352221 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a21f0a66-5049-4223-b9b1-2cfa43ac9854-v4-0-config-user-template-login\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.352252 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a21f0a66-5049-4223-b9b1-2cfa43ac9854-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.352278 4735 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a21f0a66-5049-4223-b9b1-2cfa43ac9854-v4-0-config-system-router-certs\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.352313 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a21f0a66-5049-4223-b9b1-2cfa43ac9854-v4-0-config-system-session\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.352341 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a21f0a66-5049-4223-b9b1-2cfa43ac9854-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.352375 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a21f0a66-5049-4223-b9b1-2cfa43ac9854-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.352412 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a21f0a66-5049-4223-b9b1-2cfa43ac9854-v4-0-config-system-service-ca\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: 
\"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.352435 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a21f0a66-5049-4223-b9b1-2cfa43ac9854-v4-0-config-user-template-error\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.352463 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a21f0a66-5049-4223-b9b1-2cfa43ac9854-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.352484 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a21f0a66-5049-4223-b9b1-2cfa43ac9854-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.352528 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a21f0a66-5049-4223-b9b1-2cfa43ac9854-audit-dir\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.352551 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a21f0a66-5049-4223-b9b1-2cfa43ac9854-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.352580 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a21f0a66-5049-4223-b9b1-2cfa43ac9854-audit-policies\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.352629 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.352645 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.352657 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpjrb\" (UniqueName: \"kubernetes.io/projected/6718bff7-4d68-4aa2-ad2b-1511e0799683-kube-api-access-bpjrb\") on node \"crc\" DevicePath \"\"" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.352670 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.352769 4735 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.353137 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a21f0a66-5049-4223-b9b1-2cfa43ac9854-audit-dir\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.353593 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a21f0a66-5049-4223-b9b1-2cfa43ac9854-v4-0-config-system-service-ca\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.353634 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a21f0a66-5049-4223-b9b1-2cfa43ac9854-audit-policies\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.353650 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.354166 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.354194 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.354209 4735 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6718bff7-4d68-4aa2-ad2b-1511e0799683-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.354196 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a21f0a66-5049-4223-b9b1-2cfa43ac9854-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.354223 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a21f0a66-5049-4223-b9b1-2cfa43ac9854-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.354222 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6718bff7-4d68-4aa2-ad2b-1511e0799683-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.354278 4735 reconciler_common.go:293] "Volume detached for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6718bff7-4d68-4aa2-ad2b-1511e0799683-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.356422 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a21f0a66-5049-4223-b9b1-2cfa43ac9854-v4-0-config-system-router-certs\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.356848 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a21f0a66-5049-4223-b9b1-2cfa43ac9854-v4-0-config-user-template-login\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.357346 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a21f0a66-5049-4223-b9b1-2cfa43ac9854-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.357353 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a21f0a66-5049-4223-b9b1-2cfa43ac9854-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.357538 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a21f0a66-5049-4223-b9b1-2cfa43ac9854-v4-0-config-user-template-error\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.357733 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a21f0a66-5049-4223-b9b1-2cfa43ac9854-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.358903 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a21f0a66-5049-4223-b9b1-2cfa43ac9854-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.359801 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a21f0a66-5049-4223-b9b1-2cfa43ac9854-v4-0-config-system-session\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: \"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.369160 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l874p\" (UniqueName: \"kubernetes.io/projected/a21f0a66-5049-4223-b9b1-2cfa43ac9854-kube-api-access-l874p\") pod \"oauth-openshift-7494c98dcc-97jnj\" (UID: 
\"a21f0a66-5049-4223-b9b1-2cfa43ac9854\") " pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.501669 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.581774 4735 generic.go:334] "Generic (PLEG): container finished" podID="6718bff7-4d68-4aa2-ad2b-1511e0799683" containerID="7113c55e7631ee46bd97a454731e1cd9f8656953a1e03ee0c1649f71a146907b" exitCode=0 Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.581988 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" event={"ID":"6718bff7-4d68-4aa2-ad2b-1511e0799683","Type":"ContainerDied","Data":"7113c55e7631ee46bd97a454731e1cd9f8656953a1e03ee0c1649f71a146907b"} Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.582514 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" event={"ID":"6718bff7-4d68-4aa2-ad2b-1511e0799683","Type":"ContainerDied","Data":"3cd54128a19b4a8fea5063b4ececd3e84921cc148f61cb26e80a75480bf4f83b"} Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.582551 4735 scope.go:117] "RemoveContainer" containerID="7113c55e7631ee46bd97a454731e1cd9f8656953a1e03ee0c1649f71a146907b" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.582093 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-rs7ln" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.621374 4735 scope.go:117] "RemoveContainer" containerID="7113c55e7631ee46bd97a454731e1cd9f8656953a1e03ee0c1649f71a146907b" Oct 01 10:21:00 crc kubenswrapper[4735]: E1001 10:21:00.622192 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7113c55e7631ee46bd97a454731e1cd9f8656953a1e03ee0c1649f71a146907b\": container with ID starting with 7113c55e7631ee46bd97a454731e1cd9f8656953a1e03ee0c1649f71a146907b not found: ID does not exist" containerID="7113c55e7631ee46bd97a454731e1cd9f8656953a1e03ee0c1649f71a146907b" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.622231 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7113c55e7631ee46bd97a454731e1cd9f8656953a1e03ee0c1649f71a146907b"} err="failed to get container status \"7113c55e7631ee46bd97a454731e1cd9f8656953a1e03ee0c1649f71a146907b\": rpc error: code = NotFound desc = could not find container \"7113c55e7631ee46bd97a454731e1cd9f8656953a1e03ee0c1649f71a146907b\": container with ID starting with 7113c55e7631ee46bd97a454731e1cd9f8656953a1e03ee0c1649f71a146907b not found: ID does not exist" Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.625125 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-rs7ln"] Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.631223 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-rs7ln"] Oct 01 10:21:00 crc kubenswrapper[4735]: I1001 10:21:00.675792 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7494c98dcc-97jnj"] Oct 01 10:21:01 crc kubenswrapper[4735]: I1001 10:21:01.591331 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" event={"ID":"a21f0a66-5049-4223-b9b1-2cfa43ac9854","Type":"ContainerStarted","Data":"ceb5a0b72160a31acfc6f23b5778a4e429bcb284cb2583fe3de5e793c6639de5"} Oct 01 10:21:01 crc kubenswrapper[4735]: I1001 10:21:01.591727 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" event={"ID":"a21f0a66-5049-4223-b9b1-2cfa43ac9854","Type":"ContainerStarted","Data":"579fa1e12c804b1693f89f0fb1226d5690c233f87288e6c972477b8bb1517808"} Oct 01 10:21:01 crc kubenswrapper[4735]: I1001 10:21:01.591859 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:01 crc kubenswrapper[4735]: I1001 10:21:01.596565 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" Oct 01 10:21:01 crc kubenswrapper[4735]: I1001 10:21:01.610019 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7494c98dcc-97jnj" podStartSLOduration=27.609996897 podStartE2EDuration="27.609996897s" podCreationTimestamp="2025-10-01 10:20:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:21:01.608050483 +0000 UTC m=+220.300871745" watchObservedRunningTime="2025-10-01 10:21:01.609996897 +0000 UTC m=+220.302818159" Oct 01 10:21:01 crc kubenswrapper[4735]: I1001 10:21:01.903244 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6718bff7-4d68-4aa2-ad2b-1511e0799683" path="/var/lib/kubelet/pods/6718bff7-4d68-4aa2-ad2b-1511e0799683/volumes" Oct 01 10:21:05 crc kubenswrapper[4735]: I1001 10:21:05.485661 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 10:21:05 crc kubenswrapper[4735]: I1001 10:21:05.486165 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 10:21:05 crc kubenswrapper[4735]: I1001 10:21:05.486228 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" Oct 01 10:21:05 crc kubenswrapper[4735]: I1001 10:21:05.486885 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a32554baabf84253c7bb4d630c"} pod="openshift-machine-config-operator/machine-config-daemon-xgg24" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 10:21:05 crc kubenswrapper[4735]: I1001 10:21:05.486945 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" containerID="cri-o://1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a32554baabf84253c7bb4d630c" gracePeriod=600 Oct 01 10:21:05 crc kubenswrapper[4735]: I1001 10:21:05.613526 4735 generic.go:334] "Generic (PLEG): container finished" podID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerID="1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a32554baabf84253c7bb4d630c" exitCode=0 Oct 01 10:21:05 crc kubenswrapper[4735]: I1001 10:21:05.613566 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-xgg24" event={"ID":"8c2fdbf0-2469-4ca0-8624-d63609123cd1","Type":"ContainerDied","Data":"1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a32554baabf84253c7bb4d630c"} Oct 01 10:21:06 crc kubenswrapper[4735]: I1001 10:21:06.623899 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" event={"ID":"8c2fdbf0-2469-4ca0-8624-d63609123cd1","Type":"ContainerStarted","Data":"37cba53e290bdc38e83217d84214f3378c3c1355865e08cfd659f1334766bc2e"} Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.416932 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x7j8p"] Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.417740 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-x7j8p" podUID="66aa965b-9b48-44f3-9d53-e1b0ff1829dd" containerName="registry-server" containerID="cri-o://e500d840d80ab66ec478a2be47c269a1c76aa72b4f97a6ced0fcaf2a61a8b0cb" gracePeriod=30 Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.424724 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h9dbw"] Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.425099 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h9dbw" podUID="ac706b67-93f2-4ccd-a0e8-7ebb309bc905" containerName="registry-server" containerID="cri-o://676b1cfd9def1384b48b7ef59e38915e16acce2d15294275b1cf62a7bb5ad9cc" gracePeriod=30 Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.431372 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cqpz2"] Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.431670 4735 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/marketplace-operator-79b997595-cqpz2" podUID="b79ac2f9-c8bc-4893-ace2-ca598d77ff52" containerName="marketplace-operator" containerID="cri-o://8e45bfb355a4b3adea101c5d6b0e92f94f799dea4d3be26d4ef2eed8c62dc1c9" gracePeriod=30 Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.443006 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hx799"] Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.443289 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hx799" podUID="b2ae6c32-20b1-4c6b-872d-8c72c5a28af6" containerName="registry-server" containerID="cri-o://9b6258e8be5cb301106d12900e8541b471227238253b0560400711cd77b8c8db" gracePeriod=30 Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.466894 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6nvtz"] Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.467478 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6nvtz" podUID="e3285eed-d809-4408-af84-a2020d07c59c" containerName="registry-server" containerID="cri-o://e71b5c3094f25d9617018c313c229a930525f767b52525775e0883b66b4001e8" gracePeriod=30 Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.482762 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-twfwh"] Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.483638 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-twfwh" Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.485764 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-twfwh"] Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.593279 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94467536-0aa2-426e-a14a-bb05c8afd56c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-twfwh\" (UID: \"94467536-0aa2-426e-a14a-bb05c8afd56c\") " pod="openshift-marketplace/marketplace-operator-79b997595-twfwh" Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.593411 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/94467536-0aa2-426e-a14a-bb05c8afd56c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-twfwh\" (UID: \"94467536-0aa2-426e-a14a-bb05c8afd56c\") " pod="openshift-marketplace/marketplace-operator-79b997595-twfwh" Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.593489 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6rpn\" (UniqueName: \"kubernetes.io/projected/94467536-0aa2-426e-a14a-bb05c8afd56c-kube-api-access-d6rpn\") pod \"marketplace-operator-79b997595-twfwh\" (UID: \"94467536-0aa2-426e-a14a-bb05c8afd56c\") " pod="openshift-marketplace/marketplace-operator-79b997595-twfwh" Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.650381 4735 generic.go:334] "Generic (PLEG): container finished" podID="66aa965b-9b48-44f3-9d53-e1b0ff1829dd" containerID="e500d840d80ab66ec478a2be47c269a1c76aa72b4f97a6ced0fcaf2a61a8b0cb" exitCode=0 Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.650442 4735 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-x7j8p" event={"ID":"66aa965b-9b48-44f3-9d53-e1b0ff1829dd","Type":"ContainerDied","Data":"e500d840d80ab66ec478a2be47c269a1c76aa72b4f97a6ced0fcaf2a61a8b0cb"} Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.651401 4735 generic.go:334] "Generic (PLEG): container finished" podID="b79ac2f9-c8bc-4893-ace2-ca598d77ff52" containerID="8e45bfb355a4b3adea101c5d6b0e92f94f799dea4d3be26d4ef2eed8c62dc1c9" exitCode=0 Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.651433 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cqpz2" event={"ID":"b79ac2f9-c8bc-4893-ace2-ca598d77ff52","Type":"ContainerDied","Data":"8e45bfb355a4b3adea101c5d6b0e92f94f799dea4d3be26d4ef2eed8c62dc1c9"} Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.653916 4735 generic.go:334] "Generic (PLEG): container finished" podID="b2ae6c32-20b1-4c6b-872d-8c72c5a28af6" containerID="9b6258e8be5cb301106d12900e8541b471227238253b0560400711cd77b8c8db" exitCode=0 Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.653966 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hx799" event={"ID":"b2ae6c32-20b1-4c6b-872d-8c72c5a28af6","Type":"ContainerDied","Data":"9b6258e8be5cb301106d12900e8541b471227238253b0560400711cd77b8c8db"} Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.655281 4735 generic.go:334] "Generic (PLEG): container finished" podID="ac706b67-93f2-4ccd-a0e8-7ebb309bc905" containerID="676b1cfd9def1384b48b7ef59e38915e16acce2d15294275b1cf62a7bb5ad9cc" exitCode=0 Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.655318 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h9dbw" event={"ID":"ac706b67-93f2-4ccd-a0e8-7ebb309bc905","Type":"ContainerDied","Data":"676b1cfd9def1384b48b7ef59e38915e16acce2d15294275b1cf62a7bb5ad9cc"} Oct 01 10:21:11 crc kubenswrapper[4735]: 
I1001 10:21:11.656516 4735 generic.go:334] "Generic (PLEG): container finished" podID="e3285eed-d809-4408-af84-a2020d07c59c" containerID="e71b5c3094f25d9617018c313c229a930525f767b52525775e0883b66b4001e8" exitCode=0 Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.656532 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6nvtz" event={"ID":"e3285eed-d809-4408-af84-a2020d07c59c","Type":"ContainerDied","Data":"e71b5c3094f25d9617018c313c229a930525f767b52525775e0883b66b4001e8"} Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.695383 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6rpn\" (UniqueName: \"kubernetes.io/projected/94467536-0aa2-426e-a14a-bb05c8afd56c-kube-api-access-d6rpn\") pod \"marketplace-operator-79b997595-twfwh\" (UID: \"94467536-0aa2-426e-a14a-bb05c8afd56c\") " pod="openshift-marketplace/marketplace-operator-79b997595-twfwh" Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.695533 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94467536-0aa2-426e-a14a-bb05c8afd56c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-twfwh\" (UID: \"94467536-0aa2-426e-a14a-bb05c8afd56c\") " pod="openshift-marketplace/marketplace-operator-79b997595-twfwh" Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.695557 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/94467536-0aa2-426e-a14a-bb05c8afd56c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-twfwh\" (UID: \"94467536-0aa2-426e-a14a-bb05c8afd56c\") " pod="openshift-marketplace/marketplace-operator-79b997595-twfwh" Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.697254 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94467536-0aa2-426e-a14a-bb05c8afd56c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-twfwh\" (UID: \"94467536-0aa2-426e-a14a-bb05c8afd56c\") " pod="openshift-marketplace/marketplace-operator-79b997595-twfwh" Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.702273 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/94467536-0aa2-426e-a14a-bb05c8afd56c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-twfwh\" (UID: \"94467536-0aa2-426e-a14a-bb05c8afd56c\") " pod="openshift-marketplace/marketplace-operator-79b997595-twfwh" Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.714316 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6rpn\" (UniqueName: \"kubernetes.io/projected/94467536-0aa2-426e-a14a-bb05c8afd56c-kube-api-access-d6rpn\") pod \"marketplace-operator-79b997595-twfwh\" (UID: \"94467536-0aa2-426e-a14a-bb05c8afd56c\") " pod="openshift-marketplace/marketplace-operator-79b997595-twfwh" Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.897704 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-twfwh" Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.903048 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h9dbw" Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.907542 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x7j8p" Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.913004 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cqpz2" Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.927592 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hx799" Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.934299 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6nvtz" Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.998385 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3285eed-d809-4408-af84-a2020d07c59c-utilities\") pod \"e3285eed-d809-4408-af84-a2020d07c59c\" (UID: \"e3285eed-d809-4408-af84-a2020d07c59c\") " Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.998422 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gckxs\" (UniqueName: \"kubernetes.io/projected/e3285eed-d809-4408-af84-a2020d07c59c-kube-api-access-gckxs\") pod \"e3285eed-d809-4408-af84-a2020d07c59c\" (UID: \"e3285eed-d809-4408-af84-a2020d07c59c\") " Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.998439 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls2pr\" (UniqueName: \"kubernetes.io/projected/b2ae6c32-20b1-4c6b-872d-8c72c5a28af6-kube-api-access-ls2pr\") pod \"b2ae6c32-20b1-4c6b-872d-8c72c5a28af6\" (UID: \"b2ae6c32-20b1-4c6b-872d-8c72c5a28af6\") " Oct 01 10:21:11 crc kubenswrapper[4735]: I1001 10:21:11.998463 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3285eed-d809-4408-af84-a2020d07c59c-catalog-content\") pod \"e3285eed-d809-4408-af84-a2020d07c59c\" (UID: \"e3285eed-d809-4408-af84-a2020d07c59c\") " Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:11.999345 4735 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66aa965b-9b48-44f3-9d53-e1b0ff1829dd-utilities\") pod \"66aa965b-9b48-44f3-9d53-e1b0ff1829dd\" (UID: \"66aa965b-9b48-44f3-9d53-e1b0ff1829dd\") " Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:11.999389 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75zht\" (UniqueName: \"kubernetes.io/projected/ac706b67-93f2-4ccd-a0e8-7ebb309bc905-kube-api-access-75zht\") pod \"ac706b67-93f2-4ccd-a0e8-7ebb309bc905\" (UID: \"ac706b67-93f2-4ccd-a0e8-7ebb309bc905\") " Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:11.999420 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b79ac2f9-c8bc-4893-ace2-ca598d77ff52-marketplace-operator-metrics\") pod \"b79ac2f9-c8bc-4893-ace2-ca598d77ff52\" (UID: \"b79ac2f9-c8bc-4893-ace2-ca598d77ff52\") " Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:11.999693 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b79ac2f9-c8bc-4893-ace2-ca598d77ff52-marketplace-trusted-ca\") pod \"b79ac2f9-c8bc-4893-ace2-ca598d77ff52\" (UID: \"b79ac2f9-c8bc-4893-ace2-ca598d77ff52\") " Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:11.999713 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2ae6c32-20b1-4c6b-872d-8c72c5a28af6-catalog-content\") pod \"b2ae6c32-20b1-4c6b-872d-8c72c5a28af6\" (UID: \"b2ae6c32-20b1-4c6b-872d-8c72c5a28af6\") " Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:11.999738 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac706b67-93f2-4ccd-a0e8-7ebb309bc905-utilities\") 
pod \"ac706b67-93f2-4ccd-a0e8-7ebb309bc905\" (UID: \"ac706b67-93f2-4ccd-a0e8-7ebb309bc905\") " Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:11.999759 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9knlc\" (UniqueName: \"kubernetes.io/projected/b79ac2f9-c8bc-4893-ace2-ca598d77ff52-kube-api-access-9knlc\") pod \"b79ac2f9-c8bc-4893-ace2-ca598d77ff52\" (UID: \"b79ac2f9-c8bc-4893-ace2-ca598d77ff52\") " Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:11.999785 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj856\" (UniqueName: \"kubernetes.io/projected/66aa965b-9b48-44f3-9d53-e1b0ff1829dd-kube-api-access-dj856\") pod \"66aa965b-9b48-44f3-9d53-e1b0ff1829dd\" (UID: \"66aa965b-9b48-44f3-9d53-e1b0ff1829dd\") " Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:11.999799 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2ae6c32-20b1-4c6b-872d-8c72c5a28af6-utilities\") pod \"b2ae6c32-20b1-4c6b-872d-8c72c5a28af6\" (UID: \"b2ae6c32-20b1-4c6b-872d-8c72c5a28af6\") " Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:11.999819 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66aa965b-9b48-44f3-9d53-e1b0ff1829dd-catalog-content\") pod \"66aa965b-9b48-44f3-9d53-e1b0ff1829dd\" (UID: \"66aa965b-9b48-44f3-9d53-e1b0ff1829dd\") " Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:11.999845 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac706b67-93f2-4ccd-a0e8-7ebb309bc905-catalog-content\") pod \"ac706b67-93f2-4ccd-a0e8-7ebb309bc905\" (UID: \"ac706b67-93f2-4ccd-a0e8-7ebb309bc905\") " Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.001448 4735 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b79ac2f9-c8bc-4893-ace2-ca598d77ff52-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b79ac2f9-c8bc-4893-ace2-ca598d77ff52" (UID: "b79ac2f9-c8bc-4893-ace2-ca598d77ff52"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.004289 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3285eed-d809-4408-af84-a2020d07c59c-utilities" (OuterVolumeSpecName: "utilities") pod "e3285eed-d809-4408-af84-a2020d07c59c" (UID: "e3285eed-d809-4408-af84-a2020d07c59c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.008803 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66aa965b-9b48-44f3-9d53-e1b0ff1829dd-utilities" (OuterVolumeSpecName: "utilities") pod "66aa965b-9b48-44f3-9d53-e1b0ff1829dd" (UID: "66aa965b-9b48-44f3-9d53-e1b0ff1829dd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.010772 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2ae6c32-20b1-4c6b-872d-8c72c5a28af6-kube-api-access-ls2pr" (OuterVolumeSpecName: "kube-api-access-ls2pr") pod "b2ae6c32-20b1-4c6b-872d-8c72c5a28af6" (UID: "b2ae6c32-20b1-4c6b-872d-8c72c5a28af6"). InnerVolumeSpecName "kube-api-access-ls2pr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.011383 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac706b67-93f2-4ccd-a0e8-7ebb309bc905-utilities" (OuterVolumeSpecName: "utilities") pod "ac706b67-93f2-4ccd-a0e8-7ebb309bc905" (UID: "ac706b67-93f2-4ccd-a0e8-7ebb309bc905"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.011624 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2ae6c32-20b1-4c6b-872d-8c72c5a28af6-utilities" (OuterVolumeSpecName: "utilities") pod "b2ae6c32-20b1-4c6b-872d-8c72c5a28af6" (UID: "b2ae6c32-20b1-4c6b-872d-8c72c5a28af6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.011578 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b79ac2f9-c8bc-4893-ace2-ca598d77ff52-kube-api-access-9knlc" (OuterVolumeSpecName: "kube-api-access-9knlc") pod "b79ac2f9-c8bc-4893-ace2-ca598d77ff52" (UID: "b79ac2f9-c8bc-4893-ace2-ca598d77ff52"). InnerVolumeSpecName "kube-api-access-9knlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.013973 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3285eed-d809-4408-af84-a2020d07c59c-kube-api-access-gckxs" (OuterVolumeSpecName: "kube-api-access-gckxs") pod "e3285eed-d809-4408-af84-a2020d07c59c" (UID: "e3285eed-d809-4408-af84-a2020d07c59c"). InnerVolumeSpecName "kube-api-access-gckxs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.014311 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66aa965b-9b48-44f3-9d53-e1b0ff1829dd-kube-api-access-dj856" (OuterVolumeSpecName: "kube-api-access-dj856") pod "66aa965b-9b48-44f3-9d53-e1b0ff1829dd" (UID: "66aa965b-9b48-44f3-9d53-e1b0ff1829dd"). InnerVolumeSpecName "kube-api-access-dj856". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.014405 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac706b67-93f2-4ccd-a0e8-7ebb309bc905-kube-api-access-75zht" (OuterVolumeSpecName: "kube-api-access-75zht") pod "ac706b67-93f2-4ccd-a0e8-7ebb309bc905" (UID: "ac706b67-93f2-4ccd-a0e8-7ebb309bc905"). InnerVolumeSpecName "kube-api-access-75zht". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.014913 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b79ac2f9-c8bc-4893-ace2-ca598d77ff52-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b79ac2f9-c8bc-4893-ace2-ca598d77ff52" (UID: "b79ac2f9-c8bc-4893-ace2-ca598d77ff52"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.035952 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2ae6c32-20b1-4c6b-872d-8c72c5a28af6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2ae6c32-20b1-4c6b-872d-8c72c5a28af6" (UID: "b2ae6c32-20b1-4c6b-872d-8c72c5a28af6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.064124 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66aa965b-9b48-44f3-9d53-e1b0ff1829dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "66aa965b-9b48-44f3-9d53-e1b0ff1829dd" (UID: "66aa965b-9b48-44f3-9d53-e1b0ff1829dd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.087371 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac706b67-93f2-4ccd-a0e8-7ebb309bc905-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac706b67-93f2-4ccd-a0e8-7ebb309bc905" (UID: "ac706b67-93f2-4ccd-a0e8-7ebb309bc905"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.103210 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3285eed-d809-4408-af84-a2020d07c59c-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.103242 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gckxs\" (UniqueName: \"kubernetes.io/projected/e3285eed-d809-4408-af84-a2020d07c59c-kube-api-access-gckxs\") on node \"crc\" DevicePath \"\"" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.103254 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls2pr\" (UniqueName: \"kubernetes.io/projected/b2ae6c32-20b1-4c6b-872d-8c72c5a28af6-kube-api-access-ls2pr\") on node \"crc\" DevicePath \"\"" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.103263 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66aa965b-9b48-44f3-9d53-e1b0ff1829dd-utilities\") on 
node \"crc\" DevicePath \"\"" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.103271 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75zht\" (UniqueName: \"kubernetes.io/projected/ac706b67-93f2-4ccd-a0e8-7ebb309bc905-kube-api-access-75zht\") on node \"crc\" DevicePath \"\"" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.103279 4735 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b79ac2f9-c8bc-4893-ace2-ca598d77ff52-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.103311 4735 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b79ac2f9-c8bc-4893-ace2-ca598d77ff52-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.103320 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2ae6c32-20b1-4c6b-872d-8c72c5a28af6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.103327 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac706b67-93f2-4ccd-a0e8-7ebb309bc905-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.103335 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9knlc\" (UniqueName: \"kubernetes.io/projected/b79ac2f9-c8bc-4893-ace2-ca598d77ff52-kube-api-access-9knlc\") on node \"crc\" DevicePath \"\"" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.103346 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj856\" (UniqueName: \"kubernetes.io/projected/66aa965b-9b48-44f3-9d53-e1b0ff1829dd-kube-api-access-dj856\") on node \"crc\" DevicePath \"\"" Oct 01 10:21:12 
crc kubenswrapper[4735]: I1001 10:21:12.103355 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2ae6c32-20b1-4c6b-872d-8c72c5a28af6-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.103364 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66aa965b-9b48-44f3-9d53-e1b0ff1829dd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.103374 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac706b67-93f2-4ccd-a0e8-7ebb309bc905-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.104797 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-twfwh"] Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.136335 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3285eed-d809-4408-af84-a2020d07c59c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e3285eed-d809-4408-af84-a2020d07c59c" (UID: "e3285eed-d809-4408-af84-a2020d07c59c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.204485 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3285eed-d809-4408-af84-a2020d07c59c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.666075 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hx799" event={"ID":"b2ae6c32-20b1-4c6b-872d-8c72c5a28af6","Type":"ContainerDied","Data":"70cd9772e346201ac41ad6b6c3f6af23c2c03d962ffd0c52616c57af3166745f"} Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.667458 4735 scope.go:117] "RemoveContainer" containerID="9b6258e8be5cb301106d12900e8541b471227238253b0560400711cd77b8c8db" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.666098 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hx799" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.668520 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6nvtz" event={"ID":"e3285eed-d809-4408-af84-a2020d07c59c","Type":"ContainerDied","Data":"380ecc761a26637454b99755dd2aa275734207a55c18a6107462cb94f4756063"} Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.668734 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6nvtz" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.670384 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-twfwh" event={"ID":"94467536-0aa2-426e-a14a-bb05c8afd56c","Type":"ContainerStarted","Data":"2d3ea1a8cba71d67024a1c06c1f62cf1e72fe3dea54f8527d32c0debab8cb6a9"} Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.670420 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-twfwh" event={"ID":"94467536-0aa2-426e-a14a-bb05c8afd56c","Type":"ContainerStarted","Data":"d477cfee449b356c725768740a07c4fad09679b83a16f62a8c6a80b7d34487f3"} Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.670842 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-twfwh" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.674336 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x7j8p" event={"ID":"66aa965b-9b48-44f3-9d53-e1b0ff1829dd","Type":"ContainerDied","Data":"f7314ffd4ffc1aa1db6927c7916ce18d03387ed815fba716dc042108832c29bf"} Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.674418 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x7j8p" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.675244 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-twfwh" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.678613 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cqpz2" event={"ID":"b79ac2f9-c8bc-4893-ace2-ca598d77ff52","Type":"ContainerDied","Data":"da1327489f1e1b175d745a062350fa6226ded86cd48081d7806da155d1c51ecd"} Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.678650 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cqpz2" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.681212 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h9dbw" event={"ID":"ac706b67-93f2-4ccd-a0e8-7ebb309bc905","Type":"ContainerDied","Data":"6fdcc09b20be807d3f66e215ae8d9ce3c47b399fe46a570c77c80da146b757c9"} Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.681447 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h9dbw" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.683994 4735 scope.go:117] "RemoveContainer" containerID="adbd8bf87bb89f361f4685aa00683088a4de0103e85750b34b4b44f62086dde7" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.692964 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-twfwh" podStartSLOduration=1.6929503270000001 podStartE2EDuration="1.692950327s" podCreationTimestamp="2025-10-01 10:21:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:21:12.6925378 +0000 UTC m=+231.385359062" watchObservedRunningTime="2025-10-01 10:21:12.692950327 +0000 UTC m=+231.385771589" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.709835 4735 scope.go:117] "RemoveContainer" containerID="adfe18d10a77232c579fe4bfc1506266a5d93c1c31cffc76cfca8c13d6cbf62d" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.729053 4735 scope.go:117] "RemoveContainer" containerID="e71b5c3094f25d9617018c313c229a930525f767b52525775e0883b66b4001e8" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.732649 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hx799"] Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.736138 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hx799"] Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.752385 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x7j8p"] Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.756067 4735 scope.go:117] "RemoveContainer" containerID="3086162560d419d1388efc582c9724accd9d630bfdf2eb1f05dd151c37934b73" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.757664 4735 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x7j8p"] Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.761533 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6nvtz"] Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.768137 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6nvtz"] Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.773872 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h9dbw"] Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.777392 4735 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-cqpz2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.777465 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-cqpz2" podUID="b79ac2f9-c8bc-4893-ace2-ca598d77ff52" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.783030 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h9dbw"] Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.783238 4735 scope.go:117] "RemoveContainer" containerID="edf6db42a739636c3e4a36b15663ff296a3c322cd059ad67f23c58bc8f0cc4c6" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.787789 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cqpz2"] Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.792894 4735 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cqpz2"] Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.802110 4735 scope.go:117] "RemoveContainer" containerID="e500d840d80ab66ec478a2be47c269a1c76aa72b4f97a6ced0fcaf2a61a8b0cb" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.821149 4735 scope.go:117] "RemoveContainer" containerID="205cdf8c9e62292efcdf0e729924db8bce490dc64e2317c9ceeced1cee58bbe5" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.834351 4735 scope.go:117] "RemoveContainer" containerID="c5cf2f19bb625e66680192288845ec7e9855743f7955817ea5205ea8c9f03f59" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.845668 4735 scope.go:117] "RemoveContainer" containerID="8e45bfb355a4b3adea101c5d6b0e92f94f799dea4d3be26d4ef2eed8c62dc1c9" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.857632 4735 scope.go:117] "RemoveContainer" containerID="676b1cfd9def1384b48b7ef59e38915e16acce2d15294275b1cf62a7bb5ad9cc" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.870216 4735 scope.go:117] "RemoveContainer" containerID="bbd329908b04eb22be53d2f9280258faf04450a4da788192f8568a736a8f3a6c" Oct 01 10:21:12 crc kubenswrapper[4735]: I1001 10:21:12.883714 4735 scope.go:117] "RemoveContainer" containerID="c37d4aff5240b0d74e8f163e0ba7a656bca8fe42d2da0905799c2053c43e3684" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.630212 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dshz9"] Oct 01 10:21:13 crc kubenswrapper[4735]: E1001 10:21:13.630623 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac706b67-93f2-4ccd-a0e8-7ebb309bc905" containerName="registry-server" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.630634 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac706b67-93f2-4ccd-a0e8-7ebb309bc905" containerName="registry-server" Oct 01 10:21:13 crc kubenswrapper[4735]: E1001 10:21:13.630643 
4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b79ac2f9-c8bc-4893-ace2-ca598d77ff52" containerName="marketplace-operator" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.630649 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b79ac2f9-c8bc-4893-ace2-ca598d77ff52" containerName="marketplace-operator" Oct 01 10:21:13 crc kubenswrapper[4735]: E1001 10:21:13.630656 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66aa965b-9b48-44f3-9d53-e1b0ff1829dd" containerName="extract-utilities" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.630662 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="66aa965b-9b48-44f3-9d53-e1b0ff1829dd" containerName="extract-utilities" Oct 01 10:21:13 crc kubenswrapper[4735]: E1001 10:21:13.630671 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2ae6c32-20b1-4c6b-872d-8c72c5a28af6" containerName="extract-utilities" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.630677 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2ae6c32-20b1-4c6b-872d-8c72c5a28af6" containerName="extract-utilities" Oct 01 10:21:13 crc kubenswrapper[4735]: E1001 10:21:13.630685 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3285eed-d809-4408-af84-a2020d07c59c" containerName="extract-content" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.630690 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3285eed-d809-4408-af84-a2020d07c59c" containerName="extract-content" Oct 01 10:21:13 crc kubenswrapper[4735]: E1001 10:21:13.630702 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac706b67-93f2-4ccd-a0e8-7ebb309bc905" containerName="extract-utilities" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.630708 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac706b67-93f2-4ccd-a0e8-7ebb309bc905" containerName="extract-utilities" Oct 01 10:21:13 crc kubenswrapper[4735]: E1001 
10:21:13.630715 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac706b67-93f2-4ccd-a0e8-7ebb309bc905" containerName="extract-content" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.630720 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac706b67-93f2-4ccd-a0e8-7ebb309bc905" containerName="extract-content" Oct 01 10:21:13 crc kubenswrapper[4735]: E1001 10:21:13.630729 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3285eed-d809-4408-af84-a2020d07c59c" containerName="registry-server" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.630734 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3285eed-d809-4408-af84-a2020d07c59c" containerName="registry-server" Oct 01 10:21:13 crc kubenswrapper[4735]: E1001 10:21:13.630741 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66aa965b-9b48-44f3-9d53-e1b0ff1829dd" containerName="extract-content" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.630746 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="66aa965b-9b48-44f3-9d53-e1b0ff1829dd" containerName="extract-content" Oct 01 10:21:13 crc kubenswrapper[4735]: E1001 10:21:13.630755 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2ae6c32-20b1-4c6b-872d-8c72c5a28af6" containerName="extract-content" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.630762 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2ae6c32-20b1-4c6b-872d-8c72c5a28af6" containerName="extract-content" Oct 01 10:21:13 crc kubenswrapper[4735]: E1001 10:21:13.630769 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3285eed-d809-4408-af84-a2020d07c59c" containerName="extract-utilities" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.630774 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3285eed-d809-4408-af84-a2020d07c59c" containerName="extract-utilities" Oct 01 10:21:13 crc kubenswrapper[4735]: E1001 
10:21:13.630783 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66aa965b-9b48-44f3-9d53-e1b0ff1829dd" containerName="registry-server" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.630788 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="66aa965b-9b48-44f3-9d53-e1b0ff1829dd" containerName="registry-server" Oct 01 10:21:13 crc kubenswrapper[4735]: E1001 10:21:13.630797 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2ae6c32-20b1-4c6b-872d-8c72c5a28af6" containerName="registry-server" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.630803 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2ae6c32-20b1-4c6b-872d-8c72c5a28af6" containerName="registry-server" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.630880 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac706b67-93f2-4ccd-a0e8-7ebb309bc905" containerName="registry-server" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.630891 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3285eed-d809-4408-af84-a2020d07c59c" containerName="registry-server" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.630900 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="66aa965b-9b48-44f3-9d53-e1b0ff1829dd" containerName="registry-server" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.630908 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b79ac2f9-c8bc-4893-ace2-ca598d77ff52" containerName="marketplace-operator" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.630915 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2ae6c32-20b1-4c6b-872d-8c72c5a28af6" containerName="registry-server" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.631554 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dshz9" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.636409 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.651641 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dshz9"] Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.721478 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0554f347-c661-432f-ad8f-e64550027f55-utilities\") pod \"redhat-marketplace-dshz9\" (UID: \"0554f347-c661-432f-ad8f-e64550027f55\") " pod="openshift-marketplace/redhat-marketplace-dshz9" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.721570 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl2zh\" (UniqueName: \"kubernetes.io/projected/0554f347-c661-432f-ad8f-e64550027f55-kube-api-access-zl2zh\") pod \"redhat-marketplace-dshz9\" (UID: \"0554f347-c661-432f-ad8f-e64550027f55\") " pod="openshift-marketplace/redhat-marketplace-dshz9" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.721620 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0554f347-c661-432f-ad8f-e64550027f55-catalog-content\") pod \"redhat-marketplace-dshz9\" (UID: \"0554f347-c661-432f-ad8f-e64550027f55\") " pod="openshift-marketplace/redhat-marketplace-dshz9" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.823089 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0554f347-c661-432f-ad8f-e64550027f55-utilities\") pod \"redhat-marketplace-dshz9\" (UID: 
\"0554f347-c661-432f-ad8f-e64550027f55\") " pod="openshift-marketplace/redhat-marketplace-dshz9" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.823147 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl2zh\" (UniqueName: \"kubernetes.io/projected/0554f347-c661-432f-ad8f-e64550027f55-kube-api-access-zl2zh\") pod \"redhat-marketplace-dshz9\" (UID: \"0554f347-c661-432f-ad8f-e64550027f55\") " pod="openshift-marketplace/redhat-marketplace-dshz9" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.823222 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0554f347-c661-432f-ad8f-e64550027f55-catalog-content\") pod \"redhat-marketplace-dshz9\" (UID: \"0554f347-c661-432f-ad8f-e64550027f55\") " pod="openshift-marketplace/redhat-marketplace-dshz9" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.823638 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0554f347-c661-432f-ad8f-e64550027f55-utilities\") pod \"redhat-marketplace-dshz9\" (UID: \"0554f347-c661-432f-ad8f-e64550027f55\") " pod="openshift-marketplace/redhat-marketplace-dshz9" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.823684 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0554f347-c661-432f-ad8f-e64550027f55-catalog-content\") pod \"redhat-marketplace-dshz9\" (UID: \"0554f347-c661-432f-ad8f-e64550027f55\") " pod="openshift-marketplace/redhat-marketplace-dshz9" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.834175 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-h4kg5"] Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.835214 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h4kg5" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.839656 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.846647 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h4kg5"] Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.850650 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl2zh\" (UniqueName: \"kubernetes.io/projected/0554f347-c661-432f-ad8f-e64550027f55-kube-api-access-zl2zh\") pod \"redhat-marketplace-dshz9\" (UID: \"0554f347-c661-432f-ad8f-e64550027f55\") " pod="openshift-marketplace/redhat-marketplace-dshz9" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.903462 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66aa965b-9b48-44f3-9d53-e1b0ff1829dd" path="/var/lib/kubelet/pods/66aa965b-9b48-44f3-9d53-e1b0ff1829dd/volumes" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.904164 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac706b67-93f2-4ccd-a0e8-7ebb309bc905" path="/var/lib/kubelet/pods/ac706b67-93f2-4ccd-a0e8-7ebb309bc905/volumes" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.904724 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2ae6c32-20b1-4c6b-872d-8c72c5a28af6" path="/var/lib/kubelet/pods/b2ae6c32-20b1-4c6b-872d-8c72c5a28af6/volumes" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.905715 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b79ac2f9-c8bc-4893-ace2-ca598d77ff52" path="/var/lib/kubelet/pods/b79ac2f9-c8bc-4893-ace2-ca598d77ff52/volumes" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.906153 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e3285eed-d809-4408-af84-a2020d07c59c" path="/var/lib/kubelet/pods/e3285eed-d809-4408-af84-a2020d07c59c/volumes" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.923904 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75rqg\" (UniqueName: \"kubernetes.io/projected/921e36dc-85a8-400f-b33f-c5172a57d95b-kube-api-access-75rqg\") pod \"redhat-operators-h4kg5\" (UID: \"921e36dc-85a8-400f-b33f-c5172a57d95b\") " pod="openshift-marketplace/redhat-operators-h4kg5" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.923967 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/921e36dc-85a8-400f-b33f-c5172a57d95b-utilities\") pod \"redhat-operators-h4kg5\" (UID: \"921e36dc-85a8-400f-b33f-c5172a57d95b\") " pod="openshift-marketplace/redhat-operators-h4kg5" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.924160 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/921e36dc-85a8-400f-b33f-c5172a57d95b-catalog-content\") pod \"redhat-operators-h4kg5\" (UID: \"921e36dc-85a8-400f-b33f-c5172a57d95b\") " pod="openshift-marketplace/redhat-operators-h4kg5" Oct 01 10:21:13 crc kubenswrapper[4735]: I1001 10:21:13.949783 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dshz9" Oct 01 10:21:14 crc kubenswrapper[4735]: I1001 10:21:14.025082 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/921e36dc-85a8-400f-b33f-c5172a57d95b-utilities\") pod \"redhat-operators-h4kg5\" (UID: \"921e36dc-85a8-400f-b33f-c5172a57d95b\") " pod="openshift-marketplace/redhat-operators-h4kg5" Oct 01 10:21:14 crc kubenswrapper[4735]: I1001 10:21:14.025144 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/921e36dc-85a8-400f-b33f-c5172a57d95b-catalog-content\") pod \"redhat-operators-h4kg5\" (UID: \"921e36dc-85a8-400f-b33f-c5172a57d95b\") " pod="openshift-marketplace/redhat-operators-h4kg5" Oct 01 10:21:14 crc kubenswrapper[4735]: I1001 10:21:14.025194 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75rqg\" (UniqueName: \"kubernetes.io/projected/921e36dc-85a8-400f-b33f-c5172a57d95b-kube-api-access-75rqg\") pod \"redhat-operators-h4kg5\" (UID: \"921e36dc-85a8-400f-b33f-c5172a57d95b\") " pod="openshift-marketplace/redhat-operators-h4kg5" Oct 01 10:21:14 crc kubenswrapper[4735]: I1001 10:21:14.025987 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/921e36dc-85a8-400f-b33f-c5172a57d95b-utilities\") pod \"redhat-operators-h4kg5\" (UID: \"921e36dc-85a8-400f-b33f-c5172a57d95b\") " pod="openshift-marketplace/redhat-operators-h4kg5" Oct 01 10:21:14 crc kubenswrapper[4735]: I1001 10:21:14.026901 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/921e36dc-85a8-400f-b33f-c5172a57d95b-catalog-content\") pod \"redhat-operators-h4kg5\" (UID: \"921e36dc-85a8-400f-b33f-c5172a57d95b\") " 
pod="openshift-marketplace/redhat-operators-h4kg5" Oct 01 10:21:14 crc kubenswrapper[4735]: I1001 10:21:14.050127 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75rqg\" (UniqueName: \"kubernetes.io/projected/921e36dc-85a8-400f-b33f-c5172a57d95b-kube-api-access-75rqg\") pod \"redhat-operators-h4kg5\" (UID: \"921e36dc-85a8-400f-b33f-c5172a57d95b\") " pod="openshift-marketplace/redhat-operators-h4kg5" Oct 01 10:21:14 crc kubenswrapper[4735]: I1001 10:21:14.145604 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dshz9"] Oct 01 10:21:14 crc kubenswrapper[4735]: W1001 10:21:14.154062 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0554f347_c661_432f_ad8f_e64550027f55.slice/crio-20873092a6436f5ef661458ba9ffa8c158a4f73b39d5ed2f38c94ef417ca87a8 WatchSource:0}: Error finding container 20873092a6436f5ef661458ba9ffa8c158a4f73b39d5ed2f38c94ef417ca87a8: Status 404 returned error can't find the container with id 20873092a6436f5ef661458ba9ffa8c158a4f73b39d5ed2f38c94ef417ca87a8 Oct 01 10:21:14 crc kubenswrapper[4735]: I1001 10:21:14.155844 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h4kg5" Oct 01 10:21:14 crc kubenswrapper[4735]: I1001 10:21:14.347428 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h4kg5"] Oct 01 10:21:14 crc kubenswrapper[4735]: I1001 10:21:14.697152 4735 generic.go:334] "Generic (PLEG): container finished" podID="0554f347-c661-432f-ad8f-e64550027f55" containerID="305ad158155d822c98aad443a323df23a091a2ffbee502c58ea51f4014145c20" exitCode=0 Oct 01 10:21:14 crc kubenswrapper[4735]: I1001 10:21:14.697263 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dshz9" event={"ID":"0554f347-c661-432f-ad8f-e64550027f55","Type":"ContainerDied","Data":"305ad158155d822c98aad443a323df23a091a2ffbee502c58ea51f4014145c20"} Oct 01 10:21:14 crc kubenswrapper[4735]: I1001 10:21:14.697487 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dshz9" event={"ID":"0554f347-c661-432f-ad8f-e64550027f55","Type":"ContainerStarted","Data":"20873092a6436f5ef661458ba9ffa8c158a4f73b39d5ed2f38c94ef417ca87a8"} Oct 01 10:21:14 crc kubenswrapper[4735]: I1001 10:21:14.699281 4735 generic.go:334] "Generic (PLEG): container finished" podID="921e36dc-85a8-400f-b33f-c5172a57d95b" containerID="959c88cb2539be823d9e35f0619b9dcbf9160794a12feab165cf875b9a0231c7" exitCode=0 Oct 01 10:21:14 crc kubenswrapper[4735]: I1001 10:21:14.699324 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4kg5" event={"ID":"921e36dc-85a8-400f-b33f-c5172a57d95b","Type":"ContainerDied","Data":"959c88cb2539be823d9e35f0619b9dcbf9160794a12feab165cf875b9a0231c7"} Oct 01 10:21:14 crc kubenswrapper[4735]: I1001 10:21:14.699364 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4kg5" 
event={"ID":"921e36dc-85a8-400f-b33f-c5172a57d95b","Type":"ContainerStarted","Data":"a99ed75375342e67acd8e19d86ea1edf049764c3e15dfedfa04288ad0e425857"} Oct 01 10:21:15 crc kubenswrapper[4735]: I1001 10:21:15.705879 4735 generic.go:334] "Generic (PLEG): container finished" podID="0554f347-c661-432f-ad8f-e64550027f55" containerID="973a5d72ff8334e7af2d47c1222733f025230125ff379f7716d1c0be94de7e7e" exitCode=0 Oct 01 10:21:15 crc kubenswrapper[4735]: I1001 10:21:15.705994 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dshz9" event={"ID":"0554f347-c661-432f-ad8f-e64550027f55","Type":"ContainerDied","Data":"973a5d72ff8334e7af2d47c1222733f025230125ff379f7716d1c0be94de7e7e"} Oct 01 10:21:16 crc kubenswrapper[4735]: I1001 10:21:16.035365 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pxqpm"] Oct 01 10:21:16 crc kubenswrapper[4735]: I1001 10:21:16.036988 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pxqpm" Oct 01 10:21:16 crc kubenswrapper[4735]: I1001 10:21:16.039749 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 01 10:21:16 crc kubenswrapper[4735]: I1001 10:21:16.051155 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pxqpm"] Oct 01 10:21:16 crc kubenswrapper[4735]: I1001 10:21:16.154522 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd664456-b523-413a-91a4-04d55c466f57-catalog-content\") pod \"certified-operators-pxqpm\" (UID: \"cd664456-b523-413a-91a4-04d55c466f57\") " pod="openshift-marketplace/certified-operators-pxqpm" Oct 01 10:21:16 crc kubenswrapper[4735]: I1001 10:21:16.154574 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c2qz\" (UniqueName: \"kubernetes.io/projected/cd664456-b523-413a-91a4-04d55c466f57-kube-api-access-6c2qz\") pod \"certified-operators-pxqpm\" (UID: \"cd664456-b523-413a-91a4-04d55c466f57\") " pod="openshift-marketplace/certified-operators-pxqpm" Oct 01 10:21:16 crc kubenswrapper[4735]: I1001 10:21:16.154600 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd664456-b523-413a-91a4-04d55c466f57-utilities\") pod \"certified-operators-pxqpm\" (UID: \"cd664456-b523-413a-91a4-04d55c466f57\") " pod="openshift-marketplace/certified-operators-pxqpm" Oct 01 10:21:16 crc kubenswrapper[4735]: I1001 10:21:16.233167 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lmv8p"] Oct 01 10:21:16 crc kubenswrapper[4735]: I1001 10:21:16.234135 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lmv8p" Oct 01 10:21:16 crc kubenswrapper[4735]: I1001 10:21:16.239445 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 01 10:21:16 crc kubenswrapper[4735]: I1001 10:21:16.253897 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lmv8p"] Oct 01 10:21:16 crc kubenswrapper[4735]: I1001 10:21:16.256261 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c2qz\" (UniqueName: \"kubernetes.io/projected/cd664456-b523-413a-91a4-04d55c466f57-kube-api-access-6c2qz\") pod \"certified-operators-pxqpm\" (UID: \"cd664456-b523-413a-91a4-04d55c466f57\") " pod="openshift-marketplace/certified-operators-pxqpm" Oct 01 10:21:16 crc kubenswrapper[4735]: I1001 10:21:16.256365 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd664456-b523-413a-91a4-04d55c466f57-utilities\") pod \"certified-operators-pxqpm\" (UID: \"cd664456-b523-413a-91a4-04d55c466f57\") " pod="openshift-marketplace/certified-operators-pxqpm" Oct 01 10:21:16 crc kubenswrapper[4735]: I1001 10:21:16.256450 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd664456-b523-413a-91a4-04d55c466f57-catalog-content\") pod \"certified-operators-pxqpm\" (UID: \"cd664456-b523-413a-91a4-04d55c466f57\") " pod="openshift-marketplace/certified-operators-pxqpm" Oct 01 10:21:16 crc kubenswrapper[4735]: I1001 10:21:16.257002 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd664456-b523-413a-91a4-04d55c466f57-catalog-content\") pod \"certified-operators-pxqpm\" (UID: \"cd664456-b523-413a-91a4-04d55c466f57\") " 
pod="openshift-marketplace/certified-operators-pxqpm" Oct 01 10:21:16 crc kubenswrapper[4735]: I1001 10:21:16.257241 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd664456-b523-413a-91a4-04d55c466f57-utilities\") pod \"certified-operators-pxqpm\" (UID: \"cd664456-b523-413a-91a4-04d55c466f57\") " pod="openshift-marketplace/certified-operators-pxqpm" Oct 01 10:21:16 crc kubenswrapper[4735]: I1001 10:21:16.282769 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c2qz\" (UniqueName: \"kubernetes.io/projected/cd664456-b523-413a-91a4-04d55c466f57-kube-api-access-6c2qz\") pod \"certified-operators-pxqpm\" (UID: \"cd664456-b523-413a-91a4-04d55c466f57\") " pod="openshift-marketplace/certified-operators-pxqpm" Oct 01 10:21:16 crc kubenswrapper[4735]: I1001 10:21:16.357595 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0f770d3-02c9-47fb-b650-d515b1c96ea2-utilities\") pod \"community-operators-lmv8p\" (UID: \"d0f770d3-02c9-47fb-b650-d515b1c96ea2\") " pod="openshift-marketplace/community-operators-lmv8p" Oct 01 10:21:16 crc kubenswrapper[4735]: I1001 10:21:16.358041 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv8l6\" (UniqueName: \"kubernetes.io/projected/d0f770d3-02c9-47fb-b650-d515b1c96ea2-kube-api-access-hv8l6\") pod \"community-operators-lmv8p\" (UID: \"d0f770d3-02c9-47fb-b650-d515b1c96ea2\") " pod="openshift-marketplace/community-operators-lmv8p" Oct 01 10:21:16 crc kubenswrapper[4735]: I1001 10:21:16.358081 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0f770d3-02c9-47fb-b650-d515b1c96ea2-catalog-content\") pod \"community-operators-lmv8p\" (UID: 
\"d0f770d3-02c9-47fb-b650-d515b1c96ea2\") " pod="openshift-marketplace/community-operators-lmv8p" Oct 01 10:21:16 crc kubenswrapper[4735]: I1001 10:21:16.372464 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pxqpm" Oct 01 10:21:16 crc kubenswrapper[4735]: I1001 10:21:16.461105 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv8l6\" (UniqueName: \"kubernetes.io/projected/d0f770d3-02c9-47fb-b650-d515b1c96ea2-kube-api-access-hv8l6\") pod \"community-operators-lmv8p\" (UID: \"d0f770d3-02c9-47fb-b650-d515b1c96ea2\") " pod="openshift-marketplace/community-operators-lmv8p" Oct 01 10:21:16 crc kubenswrapper[4735]: I1001 10:21:16.461166 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0f770d3-02c9-47fb-b650-d515b1c96ea2-catalog-content\") pod \"community-operators-lmv8p\" (UID: \"d0f770d3-02c9-47fb-b650-d515b1c96ea2\") " pod="openshift-marketplace/community-operators-lmv8p" Oct 01 10:21:16 crc kubenswrapper[4735]: I1001 10:21:16.461391 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0f770d3-02c9-47fb-b650-d515b1c96ea2-utilities\") pod \"community-operators-lmv8p\" (UID: \"d0f770d3-02c9-47fb-b650-d515b1c96ea2\") " pod="openshift-marketplace/community-operators-lmv8p" Oct 01 10:21:16 crc kubenswrapper[4735]: I1001 10:21:16.462064 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0f770d3-02c9-47fb-b650-d515b1c96ea2-utilities\") pod \"community-operators-lmv8p\" (UID: \"d0f770d3-02c9-47fb-b650-d515b1c96ea2\") " pod="openshift-marketplace/community-operators-lmv8p" Oct 01 10:21:16 crc kubenswrapper[4735]: I1001 10:21:16.462634 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0f770d3-02c9-47fb-b650-d515b1c96ea2-catalog-content\") pod \"community-operators-lmv8p\" (UID: \"d0f770d3-02c9-47fb-b650-d515b1c96ea2\") " pod="openshift-marketplace/community-operators-lmv8p" Oct 01 10:21:16 crc kubenswrapper[4735]: I1001 10:21:16.481352 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv8l6\" (UniqueName: \"kubernetes.io/projected/d0f770d3-02c9-47fb-b650-d515b1c96ea2-kube-api-access-hv8l6\") pod \"community-operators-lmv8p\" (UID: \"d0f770d3-02c9-47fb-b650-d515b1c96ea2\") " pod="openshift-marketplace/community-operators-lmv8p" Oct 01 10:21:16 crc kubenswrapper[4735]: I1001 10:21:16.625596 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lmv8p" Oct 01 10:21:16 crc kubenswrapper[4735]: I1001 10:21:16.714717 4735 generic.go:334] "Generic (PLEG): container finished" podID="921e36dc-85a8-400f-b33f-c5172a57d95b" containerID="29832bf2b2adf75cbdffe95a62a74855b094501c742e109dc6a0b4ea137fdfa2" exitCode=0 Oct 01 10:21:16 crc kubenswrapper[4735]: I1001 10:21:16.714792 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4kg5" event={"ID":"921e36dc-85a8-400f-b33f-c5172a57d95b","Type":"ContainerDied","Data":"29832bf2b2adf75cbdffe95a62a74855b094501c742e109dc6a0b4ea137fdfa2"} Oct 01 10:21:16 crc kubenswrapper[4735]: I1001 10:21:16.721124 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dshz9" event={"ID":"0554f347-c661-432f-ad8f-e64550027f55","Type":"ContainerStarted","Data":"631c5028cb6333e060c471db11d2e347f941b34c424101f0ce71860a8ebf38df"} Oct 01 10:21:16 crc kubenswrapper[4735]: I1001 10:21:16.757930 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pxqpm"] Oct 01 10:21:16 crc kubenswrapper[4735]: I1001 10:21:16.760047 4735 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dshz9" podStartSLOduration=2.309953317 podStartE2EDuration="3.760034494s" podCreationTimestamp="2025-10-01 10:21:13 +0000 UTC" firstStartedPulling="2025-10-01 10:21:14.698612494 +0000 UTC m=+233.391433756" lastFinishedPulling="2025-10-01 10:21:16.148693671 +0000 UTC m=+234.841514933" observedRunningTime="2025-10-01 10:21:16.753911102 +0000 UTC m=+235.446732384" watchObservedRunningTime="2025-10-01 10:21:16.760034494 +0000 UTC m=+235.452855766" Oct 01 10:21:16 crc kubenswrapper[4735]: W1001 10:21:16.760709 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd664456_b523_413a_91a4_04d55c466f57.slice/crio-31332b1dfdd273bc236474086ae1d2e243d71cd3bb452544d5be60762a120e41 WatchSource:0}: Error finding container 31332b1dfdd273bc236474086ae1d2e243d71cd3bb452544d5be60762a120e41: Status 404 returned error can't find the container with id 31332b1dfdd273bc236474086ae1d2e243d71cd3bb452544d5be60762a120e41 Oct 01 10:21:16 crc kubenswrapper[4735]: I1001 10:21:16.800478 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lmv8p"] Oct 01 10:21:16 crc kubenswrapper[4735]: W1001 10:21:16.805630 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0f770d3_02c9_47fb_b650_d515b1c96ea2.slice/crio-d7605484737268aafc9ca17e86f790bf303ab74dd7dab7ef49615c6197916093 WatchSource:0}: Error finding container d7605484737268aafc9ca17e86f790bf303ab74dd7dab7ef49615c6197916093: Status 404 returned error can't find the container with id d7605484737268aafc9ca17e86f790bf303ab74dd7dab7ef49615c6197916093 Oct 01 10:21:17 crc kubenswrapper[4735]: I1001 10:21:17.728938 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4kg5" 
event={"ID":"921e36dc-85a8-400f-b33f-c5172a57d95b","Type":"ContainerStarted","Data":"3b818d0562607e33a464311faca672f717becb68e02c95fc5a5f538b278c86b9"} Oct 01 10:21:17 crc kubenswrapper[4735]: I1001 10:21:17.731342 4735 generic.go:334] "Generic (PLEG): container finished" podID="cd664456-b523-413a-91a4-04d55c466f57" containerID="7e5616b11ed3e87d4fadcc31bb5e609131549805a3e8de6baaa69a996e48d0f8" exitCode=0 Oct 01 10:21:17 crc kubenswrapper[4735]: I1001 10:21:17.731407 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pxqpm" event={"ID":"cd664456-b523-413a-91a4-04d55c466f57","Type":"ContainerDied","Data":"7e5616b11ed3e87d4fadcc31bb5e609131549805a3e8de6baaa69a996e48d0f8"} Oct 01 10:21:17 crc kubenswrapper[4735]: I1001 10:21:17.731430 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pxqpm" event={"ID":"cd664456-b523-413a-91a4-04d55c466f57","Type":"ContainerStarted","Data":"31332b1dfdd273bc236474086ae1d2e243d71cd3bb452544d5be60762a120e41"} Oct 01 10:21:17 crc kubenswrapper[4735]: I1001 10:21:17.732979 4735 generic.go:334] "Generic (PLEG): container finished" podID="d0f770d3-02c9-47fb-b650-d515b1c96ea2" containerID="6f66b7d4eb0a33e8d79b6884fc967202fb1d9897f1249c3e7593f4bececc0563" exitCode=0 Oct 01 10:21:17 crc kubenswrapper[4735]: I1001 10:21:17.733015 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lmv8p" event={"ID":"d0f770d3-02c9-47fb-b650-d515b1c96ea2","Type":"ContainerDied","Data":"6f66b7d4eb0a33e8d79b6884fc967202fb1d9897f1249c3e7593f4bececc0563"} Oct 01 10:21:17 crc kubenswrapper[4735]: I1001 10:21:17.733050 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lmv8p" event={"ID":"d0f770d3-02c9-47fb-b650-d515b1c96ea2","Type":"ContainerStarted","Data":"d7605484737268aafc9ca17e86f790bf303ab74dd7dab7ef49615c6197916093"} Oct 01 10:21:17 crc kubenswrapper[4735]: 
I1001 10:21:17.766719 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-h4kg5" podStartSLOduration=1.977581688 podStartE2EDuration="4.766702987s" podCreationTimestamp="2025-10-01 10:21:13 +0000 UTC" firstStartedPulling="2025-10-01 10:21:14.700511326 +0000 UTC m=+233.393332588" lastFinishedPulling="2025-10-01 10:21:17.489632625 +0000 UTC m=+236.182453887" observedRunningTime="2025-10-01 10:21:17.764691732 +0000 UTC m=+236.457512994" watchObservedRunningTime="2025-10-01 10:21:17.766702987 +0000 UTC m=+236.459524249" Oct 01 10:21:18 crc kubenswrapper[4735]: I1001 10:21:18.738519 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pxqpm" event={"ID":"cd664456-b523-413a-91a4-04d55c466f57","Type":"ContainerStarted","Data":"d58aabab483dcb0a0e937448f091b1d947a90c10cf0a37cee9b9c81fd9f0f3ea"} Oct 01 10:21:18 crc kubenswrapper[4735]: I1001 10:21:18.744380 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lmv8p" event={"ID":"d0f770d3-02c9-47fb-b650-d515b1c96ea2","Type":"ContainerStarted","Data":"89ec473eaf50206c1cd670a2cd61b1e10d0cdea70fbb59e5f31a384a0e8c629d"} Oct 01 10:21:19 crc kubenswrapper[4735]: I1001 10:21:19.751206 4735 generic.go:334] "Generic (PLEG): container finished" podID="cd664456-b523-413a-91a4-04d55c466f57" containerID="d58aabab483dcb0a0e937448f091b1d947a90c10cf0a37cee9b9c81fd9f0f3ea" exitCode=0 Oct 01 10:21:19 crc kubenswrapper[4735]: I1001 10:21:19.751313 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pxqpm" event={"ID":"cd664456-b523-413a-91a4-04d55c466f57","Type":"ContainerDied","Data":"d58aabab483dcb0a0e937448f091b1d947a90c10cf0a37cee9b9c81fd9f0f3ea"} Oct 01 10:21:19 crc kubenswrapper[4735]: I1001 10:21:19.754027 4735 generic.go:334] "Generic (PLEG): container finished" podID="d0f770d3-02c9-47fb-b650-d515b1c96ea2" 
containerID="89ec473eaf50206c1cd670a2cd61b1e10d0cdea70fbb59e5f31a384a0e8c629d" exitCode=0 Oct 01 10:21:19 crc kubenswrapper[4735]: I1001 10:21:19.754056 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lmv8p" event={"ID":"d0f770d3-02c9-47fb-b650-d515b1c96ea2","Type":"ContainerDied","Data":"89ec473eaf50206c1cd670a2cd61b1e10d0cdea70fbb59e5f31a384a0e8c629d"} Oct 01 10:21:20 crc kubenswrapper[4735]: I1001 10:21:20.769552 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lmv8p" event={"ID":"d0f770d3-02c9-47fb-b650-d515b1c96ea2","Type":"ContainerStarted","Data":"27f9b2c842d1d190cfc7143e8806164ac3c5afa73383d62e769e7c3939c0d345"} Oct 01 10:21:20 crc kubenswrapper[4735]: I1001 10:21:20.797929 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lmv8p" podStartSLOduration=3.363098545 podStartE2EDuration="4.797907274s" podCreationTimestamp="2025-10-01 10:21:16 +0000 UTC" firstStartedPulling="2025-10-01 10:21:17.733790171 +0000 UTC m=+236.426611433" lastFinishedPulling="2025-10-01 10:21:19.1685989 +0000 UTC m=+237.861420162" observedRunningTime="2025-10-01 10:21:20.796023623 +0000 UTC m=+239.488844885" watchObservedRunningTime="2025-10-01 10:21:20.797907274 +0000 UTC m=+239.490728526" Oct 01 10:21:21 crc kubenswrapper[4735]: I1001 10:21:21.776228 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pxqpm" event={"ID":"cd664456-b523-413a-91a4-04d55c466f57","Type":"ContainerStarted","Data":"586ecbb850156668f8755c5896b277a2055f1ba96bf6fa89225a06337b8f167b"} Oct 01 10:21:21 crc kubenswrapper[4735]: I1001 10:21:21.795168 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pxqpm" podStartSLOduration=2.9132354400000002 podStartE2EDuration="5.795149492s" podCreationTimestamp="2025-10-01 10:21:16 +0000 
UTC" firstStartedPulling="2025-10-01 10:21:17.732211792 +0000 UTC m=+236.425033044" lastFinishedPulling="2025-10-01 10:21:20.614125824 +0000 UTC m=+239.306947096" observedRunningTime="2025-10-01 10:21:21.792070855 +0000 UTC m=+240.484892117" watchObservedRunningTime="2025-10-01 10:21:21.795149492 +0000 UTC m=+240.487970754" Oct 01 10:21:23 crc kubenswrapper[4735]: I1001 10:21:23.950563 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dshz9" Oct 01 10:21:23 crc kubenswrapper[4735]: I1001 10:21:23.951062 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dshz9" Oct 01 10:21:23 crc kubenswrapper[4735]: I1001 10:21:23.991085 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dshz9" Oct 01 10:21:24 crc kubenswrapper[4735]: I1001 10:21:24.156663 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-h4kg5" Oct 01 10:21:24 crc kubenswrapper[4735]: I1001 10:21:24.156722 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-h4kg5" Oct 01 10:21:24 crc kubenswrapper[4735]: I1001 10:21:24.192227 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-h4kg5" Oct 01 10:21:24 crc kubenswrapper[4735]: I1001 10:21:24.829254 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-h4kg5" Oct 01 10:21:24 crc kubenswrapper[4735]: I1001 10:21:24.831034 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dshz9" Oct 01 10:21:26 crc kubenswrapper[4735]: I1001 10:21:26.373338 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-pxqpm" Oct 01 10:21:26 crc kubenswrapper[4735]: I1001 10:21:26.373667 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pxqpm" Oct 01 10:21:26 crc kubenswrapper[4735]: I1001 10:21:26.414538 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pxqpm" Oct 01 10:21:26 crc kubenswrapper[4735]: I1001 10:21:26.626095 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lmv8p" Oct 01 10:21:26 crc kubenswrapper[4735]: I1001 10:21:26.626156 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lmv8p" Oct 01 10:21:26 crc kubenswrapper[4735]: I1001 10:21:26.661008 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lmv8p" Oct 01 10:21:26 crc kubenswrapper[4735]: I1001 10:21:26.835750 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lmv8p" Oct 01 10:21:26 crc kubenswrapper[4735]: I1001 10:21:26.838175 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pxqpm" Oct 01 10:23:05 crc kubenswrapper[4735]: I1001 10:23:05.485630 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 10:23:05 crc kubenswrapper[4735]: I1001 10:23:05.486230 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 10:23:35 crc kubenswrapper[4735]: I1001 10:23:35.485771 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 10:23:35 crc kubenswrapper[4735]: I1001 10:23:35.486441 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 10:24:05 crc kubenswrapper[4735]: I1001 10:24:05.485515 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 10:24:05 crc kubenswrapper[4735]: I1001 10:24:05.485943 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 10:24:05 crc kubenswrapper[4735]: I1001 10:24:05.485991 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" Oct 01 10:24:05 crc kubenswrapper[4735]: I1001 10:24:05.486522 4735 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"37cba53e290bdc38e83217d84214f3378c3c1355865e08cfd659f1334766bc2e"} pod="openshift-machine-config-operator/machine-config-daemon-xgg24" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 10:24:05 crc kubenswrapper[4735]: I1001 10:24:05.486590 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" containerID="cri-o://37cba53e290bdc38e83217d84214f3378c3c1355865e08cfd659f1334766bc2e" gracePeriod=600 Oct 01 10:24:06 crc kubenswrapper[4735]: I1001 10:24:06.589717 4735 generic.go:334] "Generic (PLEG): container finished" podID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerID="37cba53e290bdc38e83217d84214f3378c3c1355865e08cfd659f1334766bc2e" exitCode=0 Oct 01 10:24:06 crc kubenswrapper[4735]: I1001 10:24:06.589800 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" event={"ID":"8c2fdbf0-2469-4ca0-8624-d63609123cd1","Type":"ContainerDied","Data":"37cba53e290bdc38e83217d84214f3378c3c1355865e08cfd659f1334766bc2e"} Oct 01 10:24:06 crc kubenswrapper[4735]: I1001 10:24:06.590347 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" event={"ID":"8c2fdbf0-2469-4ca0-8624-d63609123cd1","Type":"ContainerStarted","Data":"218e50335c2b017baada525f436dc7da1909a27a86b76e27c3f9d13a94f70329"} Oct 01 10:24:06 crc kubenswrapper[4735]: I1001 10:24:06.590376 4735 scope.go:117] "RemoveContainer" containerID="1ce42e6fa3ca415304e3e0d504a7fe7e8ddd52a32554baabf84253c7bb4d630c" Oct 01 10:24:39 crc kubenswrapper[4735]: I1001 10:24:39.855860 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-cc2d2"] Oct 01 
10:24:39 crc kubenswrapper[4735]: I1001 10:24:39.857615 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-cc2d2" Oct 01 10:24:39 crc kubenswrapper[4735]: I1001 10:24:39.866272 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-cc2d2"] Oct 01 10:24:39 crc kubenswrapper[4735]: I1001 10:24:39.978054 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhbd2\" (UniqueName: \"kubernetes.io/projected/332905e8-07e8-455f-82bb-cf310a59e123-kube-api-access-lhbd2\") pod \"image-registry-66df7c8f76-cc2d2\" (UID: \"332905e8-07e8-455f-82bb-cf310a59e123\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc2d2" Oct 01 10:24:39 crc kubenswrapper[4735]: I1001 10:24:39.978190 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-cc2d2\" (UID: \"332905e8-07e8-455f-82bb-cf310a59e123\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc2d2" Oct 01 10:24:39 crc kubenswrapper[4735]: I1001 10:24:39.978246 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/332905e8-07e8-455f-82bb-cf310a59e123-registry-tls\") pod \"image-registry-66df7c8f76-cc2d2\" (UID: \"332905e8-07e8-455f-82bb-cf310a59e123\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc2d2" Oct 01 10:24:39 crc kubenswrapper[4735]: I1001 10:24:39.978275 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/332905e8-07e8-455f-82bb-cf310a59e123-registry-certificates\") pod 
\"image-registry-66df7c8f76-cc2d2\" (UID: \"332905e8-07e8-455f-82bb-cf310a59e123\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc2d2" Oct 01 10:24:39 crc kubenswrapper[4735]: I1001 10:24:39.978319 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/332905e8-07e8-455f-82bb-cf310a59e123-bound-sa-token\") pod \"image-registry-66df7c8f76-cc2d2\" (UID: \"332905e8-07e8-455f-82bb-cf310a59e123\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc2d2" Oct 01 10:24:39 crc kubenswrapper[4735]: I1001 10:24:39.978429 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/332905e8-07e8-455f-82bb-cf310a59e123-ca-trust-extracted\") pod \"image-registry-66df7c8f76-cc2d2\" (UID: \"332905e8-07e8-455f-82bb-cf310a59e123\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc2d2" Oct 01 10:24:39 crc kubenswrapper[4735]: I1001 10:24:39.978457 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/332905e8-07e8-455f-82bb-cf310a59e123-installation-pull-secrets\") pod \"image-registry-66df7c8f76-cc2d2\" (UID: \"332905e8-07e8-455f-82bb-cf310a59e123\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc2d2" Oct 01 10:24:39 crc kubenswrapper[4735]: I1001 10:24:39.978503 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/332905e8-07e8-455f-82bb-cf310a59e123-trusted-ca\") pod \"image-registry-66df7c8f76-cc2d2\" (UID: \"332905e8-07e8-455f-82bb-cf310a59e123\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc2d2" Oct 01 10:24:39 crc kubenswrapper[4735]: I1001 10:24:39.998011 4735 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-cc2d2\" (UID: \"332905e8-07e8-455f-82bb-cf310a59e123\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc2d2" Oct 01 10:24:40 crc kubenswrapper[4735]: I1001 10:24:40.079889 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/332905e8-07e8-455f-82bb-cf310a59e123-bound-sa-token\") pod \"image-registry-66df7c8f76-cc2d2\" (UID: \"332905e8-07e8-455f-82bb-cf310a59e123\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc2d2" Oct 01 10:24:40 crc kubenswrapper[4735]: I1001 10:24:40.079935 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/332905e8-07e8-455f-82bb-cf310a59e123-ca-trust-extracted\") pod \"image-registry-66df7c8f76-cc2d2\" (UID: \"332905e8-07e8-455f-82bb-cf310a59e123\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc2d2" Oct 01 10:24:40 crc kubenswrapper[4735]: I1001 10:24:40.079957 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/332905e8-07e8-455f-82bb-cf310a59e123-installation-pull-secrets\") pod \"image-registry-66df7c8f76-cc2d2\" (UID: \"332905e8-07e8-455f-82bb-cf310a59e123\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc2d2" Oct 01 10:24:40 crc kubenswrapper[4735]: I1001 10:24:40.079997 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/332905e8-07e8-455f-82bb-cf310a59e123-trusted-ca\") pod \"image-registry-66df7c8f76-cc2d2\" (UID: \"332905e8-07e8-455f-82bb-cf310a59e123\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc2d2" Oct 01 
10:24:40 crc kubenswrapper[4735]: I1001 10:24:40.080044 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhbd2\" (UniqueName: \"kubernetes.io/projected/332905e8-07e8-455f-82bb-cf310a59e123-kube-api-access-lhbd2\") pod \"image-registry-66df7c8f76-cc2d2\" (UID: \"332905e8-07e8-455f-82bb-cf310a59e123\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc2d2" Oct 01 10:24:40 crc kubenswrapper[4735]: I1001 10:24:40.080073 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/332905e8-07e8-455f-82bb-cf310a59e123-registry-tls\") pod \"image-registry-66df7c8f76-cc2d2\" (UID: \"332905e8-07e8-455f-82bb-cf310a59e123\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc2d2" Oct 01 10:24:40 crc kubenswrapper[4735]: I1001 10:24:40.080089 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/332905e8-07e8-455f-82bb-cf310a59e123-registry-certificates\") pod \"image-registry-66df7c8f76-cc2d2\" (UID: \"332905e8-07e8-455f-82bb-cf310a59e123\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc2d2" Oct 01 10:24:40 crc kubenswrapper[4735]: I1001 10:24:40.080769 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/332905e8-07e8-455f-82bb-cf310a59e123-ca-trust-extracted\") pod \"image-registry-66df7c8f76-cc2d2\" (UID: \"332905e8-07e8-455f-82bb-cf310a59e123\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc2d2" Oct 01 10:24:40 crc kubenswrapper[4735]: I1001 10:24:40.081561 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/332905e8-07e8-455f-82bb-cf310a59e123-registry-certificates\") pod \"image-registry-66df7c8f76-cc2d2\" (UID: 
\"332905e8-07e8-455f-82bb-cf310a59e123\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc2d2" Oct 01 10:24:40 crc kubenswrapper[4735]: I1001 10:24:40.081729 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/332905e8-07e8-455f-82bb-cf310a59e123-trusted-ca\") pod \"image-registry-66df7c8f76-cc2d2\" (UID: \"332905e8-07e8-455f-82bb-cf310a59e123\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc2d2" Oct 01 10:24:40 crc kubenswrapper[4735]: I1001 10:24:40.086454 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/332905e8-07e8-455f-82bb-cf310a59e123-installation-pull-secrets\") pod \"image-registry-66df7c8f76-cc2d2\" (UID: \"332905e8-07e8-455f-82bb-cf310a59e123\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc2d2" Oct 01 10:24:40 crc kubenswrapper[4735]: I1001 10:24:40.086700 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/332905e8-07e8-455f-82bb-cf310a59e123-registry-tls\") pod \"image-registry-66df7c8f76-cc2d2\" (UID: \"332905e8-07e8-455f-82bb-cf310a59e123\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc2d2" Oct 01 10:24:40 crc kubenswrapper[4735]: I1001 10:24:40.095556 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/332905e8-07e8-455f-82bb-cf310a59e123-bound-sa-token\") pod \"image-registry-66df7c8f76-cc2d2\" (UID: \"332905e8-07e8-455f-82bb-cf310a59e123\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc2d2" Oct 01 10:24:40 crc kubenswrapper[4735]: I1001 10:24:40.096166 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhbd2\" (UniqueName: \"kubernetes.io/projected/332905e8-07e8-455f-82bb-cf310a59e123-kube-api-access-lhbd2\") pod 
\"image-registry-66df7c8f76-cc2d2\" (UID: \"332905e8-07e8-455f-82bb-cf310a59e123\") " pod="openshift-image-registry/image-registry-66df7c8f76-cc2d2" Oct 01 10:24:40 crc kubenswrapper[4735]: I1001 10:24:40.177792 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-cc2d2" Oct 01 10:24:40 crc kubenswrapper[4735]: I1001 10:24:40.375206 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-cc2d2"] Oct 01 10:24:40 crc kubenswrapper[4735]: I1001 10:24:40.783801 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-cc2d2" event={"ID":"332905e8-07e8-455f-82bb-cf310a59e123","Type":"ContainerStarted","Data":"4f626b43b48ca7d021c9c9cece3b80a6d40a0697e29f14de1ad174df4959d597"} Oct 01 10:24:40 crc kubenswrapper[4735]: I1001 10:24:40.784112 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-cc2d2" event={"ID":"332905e8-07e8-455f-82bb-cf310a59e123","Type":"ContainerStarted","Data":"a046877bd34580c7c8d4f68346c08815c2687c88cf3411d1399a75552442881d"} Oct 01 10:24:40 crc kubenswrapper[4735]: I1001 10:24:40.784136 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-cc2d2" Oct 01 10:24:40 crc kubenswrapper[4735]: I1001 10:24:40.804005 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-cc2d2" podStartSLOduration=1.8039829649999999 podStartE2EDuration="1.803982965s" podCreationTimestamp="2025-10-01 10:24:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:24:40.802625288 +0000 UTC m=+439.495446560" watchObservedRunningTime="2025-10-01 10:24:40.803982965 +0000 UTC m=+439.496804247" Oct 01 
10:25:00 crc kubenswrapper[4735]: I1001 10:25:00.184059 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-cc2d2" Oct 01 10:25:00 crc kubenswrapper[4735]: I1001 10:25:00.239126 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-klpcq"] Oct 01 10:25:25 crc kubenswrapper[4735]: I1001 10:25:25.280161 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" podUID="19fd9940-52eb-4a65-8e75-531a27563c1b" containerName="registry" containerID="cri-o://f26265030764482850917e667e3c11a0c514d98b6f27f50c7d7562ee62019e55" gracePeriod=30 Oct 01 10:25:25 crc kubenswrapper[4735]: I1001 10:25:25.574691 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:25:25 crc kubenswrapper[4735]: I1001 10:25:25.668958 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/19fd9940-52eb-4a65-8e75-531a27563c1b-trusted-ca\") pod \"19fd9940-52eb-4a65-8e75-531a27563c1b\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " Oct 01 10:25:25 crc kubenswrapper[4735]: I1001 10:25:25.669631 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/19fd9940-52eb-4a65-8e75-531a27563c1b-bound-sa-token\") pod \"19fd9940-52eb-4a65-8e75-531a27563c1b\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " Oct 01 10:25:25 crc kubenswrapper[4735]: I1001 10:25:25.669740 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/19fd9940-52eb-4a65-8e75-531a27563c1b-ca-trust-extracted\") pod \"19fd9940-52eb-4a65-8e75-531a27563c1b\" (UID: 
\"19fd9940-52eb-4a65-8e75-531a27563c1b\") " Oct 01 10:25:25 crc kubenswrapper[4735]: I1001 10:25:25.669768 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/19fd9940-52eb-4a65-8e75-531a27563c1b-installation-pull-secrets\") pod \"19fd9940-52eb-4a65-8e75-531a27563c1b\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " Oct 01 10:25:25 crc kubenswrapper[4735]: I1001 10:25:25.669871 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19fd9940-52eb-4a65-8e75-531a27563c1b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "19fd9940-52eb-4a65-8e75-531a27563c1b" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:25:25 crc kubenswrapper[4735]: I1001 10:25:25.669979 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"19fd9940-52eb-4a65-8e75-531a27563c1b\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " Oct 01 10:25:25 crc kubenswrapper[4735]: I1001 10:25:25.670029 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/19fd9940-52eb-4a65-8e75-531a27563c1b-registry-certificates\") pod \"19fd9940-52eb-4a65-8e75-531a27563c1b\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " Oct 01 10:25:25 crc kubenswrapper[4735]: I1001 10:25:25.670075 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19fd9940-52eb-4a65-8e75-531a27563c1b-registry-tls\") pod \"19fd9940-52eb-4a65-8e75-531a27563c1b\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " Oct 01 10:25:25 crc kubenswrapper[4735]: I1001 10:25:25.670116 
4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd8vw\" (UniqueName: \"kubernetes.io/projected/19fd9940-52eb-4a65-8e75-531a27563c1b-kube-api-access-wd8vw\") pod \"19fd9940-52eb-4a65-8e75-531a27563c1b\" (UID: \"19fd9940-52eb-4a65-8e75-531a27563c1b\") " Oct 01 10:25:25 crc kubenswrapper[4735]: I1001 10:25:25.670423 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/19fd9940-52eb-4a65-8e75-531a27563c1b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 10:25:25 crc kubenswrapper[4735]: I1001 10:25:25.671273 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19fd9940-52eb-4a65-8e75-531a27563c1b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "19fd9940-52eb-4a65-8e75-531a27563c1b" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:25:25 crc kubenswrapper[4735]: I1001 10:25:25.675466 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19fd9940-52eb-4a65-8e75-531a27563c1b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "19fd9940-52eb-4a65-8e75-531a27563c1b" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:25:25 crc kubenswrapper[4735]: I1001 10:25:25.675525 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19fd9940-52eb-4a65-8e75-531a27563c1b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "19fd9940-52eb-4a65-8e75-531a27563c1b" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:25:25 crc kubenswrapper[4735]: I1001 10:25:25.675754 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19fd9940-52eb-4a65-8e75-531a27563c1b-kube-api-access-wd8vw" (OuterVolumeSpecName: "kube-api-access-wd8vw") pod "19fd9940-52eb-4a65-8e75-531a27563c1b" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b"). InnerVolumeSpecName "kube-api-access-wd8vw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:25:25 crc kubenswrapper[4735]: I1001 10:25:25.676193 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19fd9940-52eb-4a65-8e75-531a27563c1b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "19fd9940-52eb-4a65-8e75-531a27563c1b" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:25:25 crc kubenswrapper[4735]: I1001 10:25:25.681597 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "19fd9940-52eb-4a65-8e75-531a27563c1b" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 01 10:25:25 crc kubenswrapper[4735]: I1001 10:25:25.687879 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19fd9940-52eb-4a65-8e75-531a27563c1b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "19fd9940-52eb-4a65-8e75-531a27563c1b" (UID: "19fd9940-52eb-4a65-8e75-531a27563c1b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:25:25 crc kubenswrapper[4735]: I1001 10:25:25.772037 4735 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/19fd9940-52eb-4a65-8e75-531a27563c1b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 01 10:25:25 crc kubenswrapper[4735]: I1001 10:25:25.772070 4735 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/19fd9940-52eb-4a65-8e75-531a27563c1b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 01 10:25:25 crc kubenswrapper[4735]: I1001 10:25:25.772082 4735 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/19fd9940-52eb-4a65-8e75-531a27563c1b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 01 10:25:25 crc kubenswrapper[4735]: I1001 10:25:25.772097 4735 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/19fd9940-52eb-4a65-8e75-531a27563c1b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 01 10:25:25 crc kubenswrapper[4735]: I1001 10:25:25.772110 4735 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19fd9940-52eb-4a65-8e75-531a27563c1b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 01 10:25:25 crc kubenswrapper[4735]: I1001 10:25:25.772121 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wd8vw\" (UniqueName: \"kubernetes.io/projected/19fd9940-52eb-4a65-8e75-531a27563c1b-kube-api-access-wd8vw\") on node \"crc\" DevicePath \"\"" Oct 01 10:25:26 crc kubenswrapper[4735]: I1001 10:25:26.022859 4735 generic.go:334] "Generic (PLEG): container finished" podID="19fd9940-52eb-4a65-8e75-531a27563c1b" containerID="f26265030764482850917e667e3c11a0c514d98b6f27f50c7d7562ee62019e55" exitCode=0 Oct 01 10:25:26 crc 
kubenswrapper[4735]: I1001 10:25:26.022913 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" event={"ID":"19fd9940-52eb-4a65-8e75-531a27563c1b","Type":"ContainerDied","Data":"f26265030764482850917e667e3c11a0c514d98b6f27f50c7d7562ee62019e55"} Oct 01 10:25:26 crc kubenswrapper[4735]: I1001 10:25:26.022957 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" event={"ID":"19fd9940-52eb-4a65-8e75-531a27563c1b","Type":"ContainerDied","Data":"c757aab0c46ee644eb24762a7cdc62f339e491b5a51f297d409280012202302f"} Oct 01 10:25:26 crc kubenswrapper[4735]: I1001 10:25:26.022980 4735 scope.go:117] "RemoveContainer" containerID="f26265030764482850917e667e3c11a0c514d98b6f27f50c7d7562ee62019e55" Oct 01 10:25:26 crc kubenswrapper[4735]: I1001 10:25:26.022976 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-klpcq" Oct 01 10:25:26 crc kubenswrapper[4735]: I1001 10:25:26.040189 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-klpcq"] Oct 01 10:25:26 crc kubenswrapper[4735]: I1001 10:25:26.043247 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-klpcq"] Oct 01 10:25:26 crc kubenswrapper[4735]: I1001 10:25:26.044149 4735 scope.go:117] "RemoveContainer" containerID="f26265030764482850917e667e3c11a0c514d98b6f27f50c7d7562ee62019e55" Oct 01 10:25:26 crc kubenswrapper[4735]: E1001 10:25:26.044620 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f26265030764482850917e667e3c11a0c514d98b6f27f50c7d7562ee62019e55\": container with ID starting with f26265030764482850917e667e3c11a0c514d98b6f27f50c7d7562ee62019e55 not found: ID does not exist" 
containerID="f26265030764482850917e667e3c11a0c514d98b6f27f50c7d7562ee62019e55" Oct 01 10:25:26 crc kubenswrapper[4735]: I1001 10:25:26.044669 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f26265030764482850917e667e3c11a0c514d98b6f27f50c7d7562ee62019e55"} err="failed to get container status \"f26265030764482850917e667e3c11a0c514d98b6f27f50c7d7562ee62019e55\": rpc error: code = NotFound desc = could not find container \"f26265030764482850917e667e3c11a0c514d98b6f27f50c7d7562ee62019e55\": container with ID starting with f26265030764482850917e667e3c11a0c514d98b6f27f50c7d7562ee62019e55 not found: ID does not exist" Oct 01 10:25:27 crc kubenswrapper[4735]: I1001 10:25:27.913258 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19fd9940-52eb-4a65-8e75-531a27563c1b" path="/var/lib/kubelet/pods/19fd9940-52eb-4a65-8e75-531a27563c1b/volumes" Oct 01 10:26:05 crc kubenswrapper[4735]: I1001 10:26:05.485759 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 10:26:05 crc kubenswrapper[4735]: I1001 10:26:05.486351 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 10:26:27 crc kubenswrapper[4735]: I1001 10:26:27.776086 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-mvxdt"] Oct 01 10:26:27 crc kubenswrapper[4735]: E1001 10:26:27.776964 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="19fd9940-52eb-4a65-8e75-531a27563c1b" containerName="registry" Oct 01 10:26:27 crc kubenswrapper[4735]: I1001 10:26:27.776983 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="19fd9940-52eb-4a65-8e75-531a27563c1b" containerName="registry" Oct 01 10:26:27 crc kubenswrapper[4735]: I1001 10:26:27.777141 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="19fd9940-52eb-4a65-8e75-531a27563c1b" containerName="registry" Oct 01 10:26:27 crc kubenswrapper[4735]: I1001 10:26:27.777653 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-mvxdt" Oct 01 10:26:27 crc kubenswrapper[4735]: I1001 10:26:27.788978 4735 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-lp622" Oct 01 10:26:27 crc kubenswrapper[4735]: I1001 10:26:27.789288 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 01 10:26:27 crc kubenswrapper[4735]: I1001 10:26:27.789710 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 01 10:26:27 crc kubenswrapper[4735]: I1001 10:26:27.790910 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-mvxdt"] Oct 01 10:26:27 crc kubenswrapper[4735]: I1001 10:26:27.793528 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-srpsm"] Oct 01 10:26:27 crc kubenswrapper[4735]: I1001 10:26:27.794272 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-srpsm" Oct 01 10:26:27 crc kubenswrapper[4735]: I1001 10:26:27.802967 4735 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-ngtlh" Oct 01 10:26:27 crc kubenswrapper[4735]: I1001 10:26:27.804047 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-d2xk7"] Oct 01 10:26:27 crc kubenswrapper[4735]: I1001 10:26:27.804718 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-d2xk7" Oct 01 10:26:27 crc kubenswrapper[4735]: I1001 10:26:27.812606 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-srpsm"] Oct 01 10:26:27 crc kubenswrapper[4735]: I1001 10:26:27.814772 4735 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-mf7lv" Oct 01 10:26:27 crc kubenswrapper[4735]: I1001 10:26:27.835886 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-d2xk7"] Oct 01 10:26:27 crc kubenswrapper[4735]: I1001 10:26:27.853152 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxvsh\" (UniqueName: \"kubernetes.io/projected/a4f1a7ac-f922-4a82-8675-c87e0921512f-kube-api-access-lxvsh\") pod \"cert-manager-webhook-5655c58dd6-d2xk7\" (UID: \"a4f1a7ac-f922-4a82-8675-c87e0921512f\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-d2xk7" Oct 01 10:26:27 crc kubenswrapper[4735]: I1001 10:26:27.853192 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llqcg\" (UniqueName: \"kubernetes.io/projected/63d28348-4347-431a-97e8-3526e9f66a68-kube-api-access-llqcg\") pod \"cert-manager-cainjector-7f985d654d-mvxdt\" (UID: \"63d28348-4347-431a-97e8-3526e9f66a68\") " 
pod="cert-manager/cert-manager-cainjector-7f985d654d-mvxdt" Oct 01 10:26:27 crc kubenswrapper[4735]: I1001 10:26:27.853224 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz6pw\" (UniqueName: \"kubernetes.io/projected/4a6804d5-21c5-4d3d-9504-d769df881c52-kube-api-access-nz6pw\") pod \"cert-manager-5b446d88c5-srpsm\" (UID: \"4a6804d5-21c5-4d3d-9504-d769df881c52\") " pod="cert-manager/cert-manager-5b446d88c5-srpsm" Oct 01 10:26:27 crc kubenswrapper[4735]: I1001 10:26:27.954130 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxvsh\" (UniqueName: \"kubernetes.io/projected/a4f1a7ac-f922-4a82-8675-c87e0921512f-kube-api-access-lxvsh\") pod \"cert-manager-webhook-5655c58dd6-d2xk7\" (UID: \"a4f1a7ac-f922-4a82-8675-c87e0921512f\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-d2xk7" Oct 01 10:26:27 crc kubenswrapper[4735]: I1001 10:26:27.954188 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llqcg\" (UniqueName: \"kubernetes.io/projected/63d28348-4347-431a-97e8-3526e9f66a68-kube-api-access-llqcg\") pod \"cert-manager-cainjector-7f985d654d-mvxdt\" (UID: \"63d28348-4347-431a-97e8-3526e9f66a68\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-mvxdt" Oct 01 10:26:27 crc kubenswrapper[4735]: I1001 10:26:27.954243 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz6pw\" (UniqueName: \"kubernetes.io/projected/4a6804d5-21c5-4d3d-9504-d769df881c52-kube-api-access-nz6pw\") pod \"cert-manager-5b446d88c5-srpsm\" (UID: \"4a6804d5-21c5-4d3d-9504-d769df881c52\") " pod="cert-manager/cert-manager-5b446d88c5-srpsm" Oct 01 10:26:27 crc kubenswrapper[4735]: I1001 10:26:27.972297 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxvsh\" (UniqueName: 
\"kubernetes.io/projected/a4f1a7ac-f922-4a82-8675-c87e0921512f-kube-api-access-lxvsh\") pod \"cert-manager-webhook-5655c58dd6-d2xk7\" (UID: \"a4f1a7ac-f922-4a82-8675-c87e0921512f\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-d2xk7" Oct 01 10:26:27 crc kubenswrapper[4735]: I1001 10:26:27.972309 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llqcg\" (UniqueName: \"kubernetes.io/projected/63d28348-4347-431a-97e8-3526e9f66a68-kube-api-access-llqcg\") pod \"cert-manager-cainjector-7f985d654d-mvxdt\" (UID: \"63d28348-4347-431a-97e8-3526e9f66a68\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-mvxdt" Oct 01 10:26:27 crc kubenswrapper[4735]: I1001 10:26:27.972592 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz6pw\" (UniqueName: \"kubernetes.io/projected/4a6804d5-21c5-4d3d-9504-d769df881c52-kube-api-access-nz6pw\") pod \"cert-manager-5b446d88c5-srpsm\" (UID: \"4a6804d5-21c5-4d3d-9504-d769df881c52\") " pod="cert-manager/cert-manager-5b446d88c5-srpsm" Oct 01 10:26:28 crc kubenswrapper[4735]: I1001 10:26:28.109148 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-mvxdt" Oct 01 10:26:28 crc kubenswrapper[4735]: I1001 10:26:28.122566 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-srpsm" Oct 01 10:26:28 crc kubenswrapper[4735]: I1001 10:26:28.131138 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-d2xk7" Oct 01 10:26:28 crc kubenswrapper[4735]: I1001 10:26:28.284812 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-mvxdt"] Oct 01 10:26:28 crc kubenswrapper[4735]: I1001 10:26:28.292354 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 10:26:28 crc kubenswrapper[4735]: I1001 10:26:28.350939 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-mvxdt" event={"ID":"63d28348-4347-431a-97e8-3526e9f66a68","Type":"ContainerStarted","Data":"7503214daf561d7685526292435682d67bfec7d595d49ac16af3543cc5e0690d"} Oct 01 10:26:28 crc kubenswrapper[4735]: I1001 10:26:28.538253 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-d2xk7"] Oct 01 10:26:28 crc kubenswrapper[4735]: W1001 10:26:28.543146 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4f1a7ac_f922_4a82_8675_c87e0921512f.slice/crio-3143c6c03346a16d11a27985e92dbe8b82f1ada1e59fd7a8f852e20c56b6b80d WatchSource:0}: Error finding container 3143c6c03346a16d11a27985e92dbe8b82f1ada1e59fd7a8f852e20c56b6b80d: Status 404 returned error can't find the container with id 3143c6c03346a16d11a27985e92dbe8b82f1ada1e59fd7a8f852e20c56b6b80d Oct 01 10:26:28 crc kubenswrapper[4735]: I1001 10:26:28.544605 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-srpsm"] Oct 01 10:26:28 crc kubenswrapper[4735]: W1001 10:26:28.547205 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a6804d5_21c5_4d3d_9504_d769df881c52.slice/crio-f9c7777d04012641f2d18f1309beeb5872f7a3da30e9c9647dacadbb868d03ff WatchSource:0}: Error finding container 
f9c7777d04012641f2d18f1309beeb5872f7a3da30e9c9647dacadbb868d03ff: Status 404 returned error can't find the container with id f9c7777d04012641f2d18f1309beeb5872f7a3da30e9c9647dacadbb868d03ff Oct 01 10:26:29 crc kubenswrapper[4735]: I1001 10:26:29.359294 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-d2xk7" event={"ID":"a4f1a7ac-f922-4a82-8675-c87e0921512f","Type":"ContainerStarted","Data":"3143c6c03346a16d11a27985e92dbe8b82f1ada1e59fd7a8f852e20c56b6b80d"} Oct 01 10:26:29 crc kubenswrapper[4735]: I1001 10:26:29.361536 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-srpsm" event={"ID":"4a6804d5-21c5-4d3d-9504-d769df881c52","Type":"ContainerStarted","Data":"f9c7777d04012641f2d18f1309beeb5872f7a3da30e9c9647dacadbb868d03ff"} Oct 01 10:26:31 crc kubenswrapper[4735]: I1001 10:26:31.373533 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-mvxdt" event={"ID":"63d28348-4347-431a-97e8-3526e9f66a68","Type":"ContainerStarted","Data":"247de607a34bf5993958461ac681c6163dbc47c2061099ddfcc9dc7de79ba714"} Oct 01 10:26:31 crc kubenswrapper[4735]: I1001 10:26:31.386889 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-mvxdt" podStartSLOduration=2.268966518 podStartE2EDuration="4.386873316s" podCreationTimestamp="2025-10-01 10:26:27 +0000 UTC" firstStartedPulling="2025-10-01 10:26:28.292149254 +0000 UTC m=+546.984970516" lastFinishedPulling="2025-10-01 10:26:30.410056012 +0000 UTC m=+549.102877314" observedRunningTime="2025-10-01 10:26:31.386728932 +0000 UTC m=+550.079550194" watchObservedRunningTime="2025-10-01 10:26:31.386873316 +0000 UTC m=+550.079694578" Oct 01 10:26:35 crc kubenswrapper[4735]: I1001 10:26:35.486247 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 10:26:35 crc kubenswrapper[4735]: I1001 10:26:35.486723 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 10:26:38 crc kubenswrapper[4735]: I1001 10:26:38.476796 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-k5mgz"] Oct 01 10:26:38 crc kubenswrapper[4735]: I1001 10:26:38.477566 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="ovn-controller" containerID="cri-o://54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82" gracePeriod=30 Oct 01 10:26:38 crc kubenswrapper[4735]: I1001 10:26:38.477692 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="sbdb" containerID="cri-o://45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724" gracePeriod=30 Oct 01 10:26:38 crc kubenswrapper[4735]: I1001 10:26:38.477770 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="nbdb" containerID="cri-o://efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b" gracePeriod=30 Oct 01 10:26:38 crc kubenswrapper[4735]: I1001 10:26:38.477810 4735 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="northd" containerID="cri-o://a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1" gracePeriod=30 Oct 01 10:26:38 crc kubenswrapper[4735]: I1001 10:26:38.477851 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55" gracePeriod=30 Oct 01 10:26:38 crc kubenswrapper[4735]: I1001 10:26:38.477890 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="kube-rbac-proxy-node" containerID="cri-o://033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5" gracePeriod=30 Oct 01 10:26:38 crc kubenswrapper[4735]: I1001 10:26:38.477925 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="ovn-acl-logging" containerID="cri-o://7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766" gracePeriod=30 Oct 01 10:26:38 crc kubenswrapper[4735]: I1001 10:26:38.513448 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="ovnkube-controller" containerID="cri-o://83ba1e5271b7461df27551275fcd38bfd0342e811f0565548c17fd46d0c947b2" gracePeriod=30 Oct 01 10:26:38 crc kubenswrapper[4735]: E1001 10:26:38.648813 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://quay.io/jetstack/cert-manager-webhook:v1.14.4: Requesting bearer token: invalid status code from 
registry 504 (Gateway Timeout)" image="quay.io/jetstack/cert-manager-webhook:v1.14.4" Oct 01 10:26:38 crc kubenswrapper[4735]: E1001 10:26:38.649013 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cert-manager-webhook,Image:quay.io/jetstack/cert-manager-webhook:v1.14.4,Command:[],Args:[--v=2 --secure-port=10250 --dynamic-serving-ca-secret-namespace=$(POD_NAMESPACE) --dynamic-serving-ca-secret-name=cert-manager-webhook-ca --dynamic-serving-dns-names=cert-manager-webhook --dynamic-serving-dns-names=cert-manager-webhook.$(POD_NAMESPACE) --dynamic-serving-dns-names=cert-manager-webhook.$(POD_NAMESPACE).svc],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:10250,Protocol:TCP,HostIP:,},ContainerPort{Name:healthcheck,HostPort:0,ContainerPort:6080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lxvsh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 6080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:60,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 6080 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000680000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cert-manager-webhook-5655c58dd6-d2xk7_cert-manager(a4f1a7ac-f922-4a82-8675-c87e0921512f): ErrImagePull: initializing source docker://quay.io/jetstack/cert-manager-webhook:v1.14.4: Requesting bearer token: invalid status code from registry 504 (Gateway Timeout)" logger="UnhandledError" Oct 01 10:26:38 crc kubenswrapper[4735]: E1001 10:26:38.650257 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cert-manager-webhook\" with ErrImagePull: \"initializing source docker://quay.io/jetstack/cert-manager-webhook:v1.14.4: Requesting bearer token: invalid status code from registry 504 (Gateway Timeout)\"" pod="cert-manager/cert-manager-webhook-5655c58dd6-d2xk7" podUID="a4f1a7ac-f922-4a82-8675-c87e0921512f" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.239615 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k5mgz_32f1531f-5034-48d4-b694-efc774226e37/ovnkube-controller/3.log" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.242964 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k5mgz_32f1531f-5034-48d4-b694-efc774226e37/ovn-acl-logging/0.log" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.243490 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k5mgz_32f1531f-5034-48d4-b694-efc774226e37/ovn-controller/0.log" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.243901 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.308657 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zdqk4"] Oct 01 10:26:39 crc kubenswrapper[4735]: E1001 10:26:39.308899 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="ovnkube-controller" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.308915 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="ovnkube-controller" Oct 01 10:26:39 crc kubenswrapper[4735]: E1001 10:26:39.308930 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="kube-rbac-proxy-node" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.308938 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="kube-rbac-proxy-node" Oct 01 10:26:39 crc kubenswrapper[4735]: E1001 10:26:39.308949 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="kube-rbac-proxy-ovn-metrics" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.308959 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="kube-rbac-proxy-ovn-metrics" Oct 01 10:26:39 crc kubenswrapper[4735]: E1001 
10:26:39.308971 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="nbdb" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.308979 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="nbdb" Oct 01 10:26:39 crc kubenswrapper[4735]: E1001 10:26:39.308989 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="northd" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.308997 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="northd" Oct 01 10:26:39 crc kubenswrapper[4735]: E1001 10:26:39.309008 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="sbdb" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.309015 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="sbdb" Oct 01 10:26:39 crc kubenswrapper[4735]: E1001 10:26:39.309028 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="ovn-controller" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.309036 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="ovn-controller" Oct 01 10:26:39 crc kubenswrapper[4735]: E1001 10:26:39.309049 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="ovn-acl-logging" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.309057 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="ovn-acl-logging" Oct 01 10:26:39 crc kubenswrapper[4735]: E1001 10:26:39.309068 4735 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="kubecfg-setup" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.309076 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="kubecfg-setup" Oct 01 10:26:39 crc kubenswrapper[4735]: E1001 10:26:39.309087 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="ovnkube-controller" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.309096 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="ovnkube-controller" Oct 01 10:26:39 crc kubenswrapper[4735]: E1001 10:26:39.309104 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="ovnkube-controller" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.309112 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="ovnkube-controller" Oct 01 10:26:39 crc kubenswrapper[4735]: E1001 10:26:39.309122 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="ovnkube-controller" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.309129 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="ovnkube-controller" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.309260 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="ovnkube-controller" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.309274 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="sbdb" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.309288 4735 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="kube-rbac-proxy-node" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.309300 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="ovnkube-controller" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.309309 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="ovnkube-controller" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.309321 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="northd" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.309332 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="ovn-acl-logging" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.309341 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="ovnkube-controller" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.309353 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="kube-rbac-proxy-ovn-metrics" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.309365 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="ovnkube-controller" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.309375 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="nbdb" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.309387 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="ovn-controller" Oct 01 10:26:39 crc kubenswrapper[4735]: E1001 10:26:39.309537 4735 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="ovnkube-controller" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.309551 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="32f1531f-5034-48d4-b694-efc774226e37" containerName="ovnkube-controller" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.312074 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.313685 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-kubelet\") pod \"32f1531f-5034-48d4-b694-efc774226e37\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.313713 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-cni-bin\") pod \"32f1531f-5034-48d4-b694-efc774226e37\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.313740 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-systemd-units\") pod \"32f1531f-5034-48d4-b694-efc774226e37\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.313762 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/32f1531f-5034-48d4-b694-efc774226e37-ovnkube-config\") pod \"32f1531f-5034-48d4-b694-efc774226e37\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.313783 4735 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-var-lib-cni-networks-ovn-kubernetes\") pod \"32f1531f-5034-48d4-b694-efc774226e37\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.313806 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-run-systemd\") pod \"32f1531f-5034-48d4-b694-efc774226e37\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.313830 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-run-openvswitch\") pod \"32f1531f-5034-48d4-b694-efc774226e37\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.313810 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "32f1531f-5034-48d4-b694-efc774226e37" (UID: "32f1531f-5034-48d4-b694-efc774226e37"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.313853 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/32f1531f-5034-48d4-b694-efc774226e37-ovnkube-script-lib\") pod \"32f1531f-5034-48d4-b694-efc774226e37\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.313893 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-var-lib-openvswitch\") pod \"32f1531f-5034-48d4-b694-efc774226e37\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.313847 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "32f1531f-5034-48d4-b694-efc774226e37" (UID: "32f1531f-5034-48d4-b694-efc774226e37"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.313848 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "32f1531f-5034-48d4-b694-efc774226e37" (UID: "32f1531f-5034-48d4-b694-efc774226e37"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.313949 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "32f1531f-5034-48d4-b694-efc774226e37" (UID: "32f1531f-5034-48d4-b694-efc774226e37"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.313876 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "32f1531f-5034-48d4-b694-efc774226e37" (UID: "32f1531f-5034-48d4-b694-efc774226e37"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.313902 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "32f1531f-5034-48d4-b694-efc774226e37" (UID: "32f1531f-5034-48d4-b694-efc774226e37"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.313979 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "32f1531f-5034-48d4-b694-efc774226e37" (UID: "32f1531f-5034-48d4-b694-efc774226e37"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.313922 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-cni-netd\") pod \"32f1531f-5034-48d4-b694-efc774226e37\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.314066 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/32f1531f-5034-48d4-b694-efc774226e37-ovn-node-metrics-cert\") pod \"32f1531f-5034-48d4-b694-efc774226e37\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.314108 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-run-netns\") pod \"32f1531f-5034-48d4-b694-efc774226e37\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.314164 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/32f1531f-5034-48d4-b694-efc774226e37-env-overrides\") pod \"32f1531f-5034-48d4-b694-efc774226e37\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.314192 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-log-socket\") pod \"32f1531f-5034-48d4-b694-efc774226e37\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.314243 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqpq8\" 
(UniqueName: \"kubernetes.io/projected/32f1531f-5034-48d4-b694-efc774226e37-kube-api-access-fqpq8\") pod \"32f1531f-5034-48d4-b694-efc774226e37\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.314274 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-run-ovn\") pod \"32f1531f-5034-48d4-b694-efc774226e37\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.314320 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-run-ovn-kubernetes\") pod \"32f1531f-5034-48d4-b694-efc774226e37\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.314326 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32f1531f-5034-48d4-b694-efc774226e37-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "32f1531f-5034-48d4-b694-efc774226e37" (UID: "32f1531f-5034-48d4-b694-efc774226e37"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.314354 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-node-log\") pod \"32f1531f-5034-48d4-b694-efc774226e37\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.314399 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-slash\") pod \"32f1531f-5034-48d4-b694-efc774226e37\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.314430 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-etc-openvswitch\") pod \"32f1531f-5034-48d4-b694-efc774226e37\" (UID: \"32f1531f-5034-48d4-b694-efc774226e37\") " Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.315020 4735 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.314355 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32f1531f-5034-48d4-b694-efc774226e37-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "32f1531f-5034-48d4-b694-efc774226e37" (UID: "32f1531f-5034-48d4-b694-efc774226e37"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.315052 4735 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/32f1531f-5034-48d4-b694-efc774226e37-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.315077 4735 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.315102 4735 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.315129 4735 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.315151 4735 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.314379 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "32f1531f-5034-48d4-b694-efc774226e37" (UID: "32f1531f-5034-48d4-b694-efc774226e37"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.315173 4735 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.315201 4735 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.314423 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-log-socket" (OuterVolumeSpecName: "log-socket") pod "32f1531f-5034-48d4-b694-efc774226e37" (UID: "32f1531f-5034-48d4-b694-efc774226e37"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.314703 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "32f1531f-5034-48d4-b694-efc774226e37" (UID: "32f1531f-5034-48d4-b694-efc774226e37"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.314984 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32f1531f-5034-48d4-b694-efc774226e37-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "32f1531f-5034-48d4-b694-efc774226e37" (UID: "32f1531f-5034-48d4-b694-efc774226e37"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.315014 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-node-log" (OuterVolumeSpecName: "node-log") pod "32f1531f-5034-48d4-b694-efc774226e37" (UID: "32f1531f-5034-48d4-b694-efc774226e37"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.315030 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "32f1531f-5034-48d4-b694-efc774226e37" (UID: "32f1531f-5034-48d4-b694-efc774226e37"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.315070 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-slash" (OuterVolumeSpecName: "host-slash") pod "32f1531f-5034-48d4-b694-efc774226e37" (UID: "32f1531f-5034-48d4-b694-efc774226e37"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.315091 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "32f1531f-5034-48d4-b694-efc774226e37" (UID: "32f1531f-5034-48d4-b694-efc774226e37"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.319894 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32f1531f-5034-48d4-b694-efc774226e37-kube-api-access-fqpq8" (OuterVolumeSpecName: "kube-api-access-fqpq8") pod "32f1531f-5034-48d4-b694-efc774226e37" (UID: "32f1531f-5034-48d4-b694-efc774226e37"). InnerVolumeSpecName "kube-api-access-fqpq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.320813 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32f1531f-5034-48d4-b694-efc774226e37-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "32f1531f-5034-48d4-b694-efc774226e37" (UID: "32f1531f-5034-48d4-b694-efc774226e37"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.341903 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "32f1531f-5034-48d4-b694-efc774226e37" (UID: "32f1531f-5034-48d4-b694-efc774226e37"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.415863 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-var-lib-openvswitch\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.415903 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-env-overrides\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.415923 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-node-log\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.415946 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.416020 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-ovnkube-script-lib\") 
pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.416045 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-log-socket\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.416068 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-ovn-node-metrics-cert\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.416097 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-host-run-ovn-kubernetes\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.416111 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-systemd-units\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.416126 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-etc-openvswitch\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.416144 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-ovnkube-config\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.416161 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-host-cni-netd\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.416178 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-host-run-netns\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.416194 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-host-kubelet\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.416217 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-host-slash\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.416244 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-host-cni-bin\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.416259 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-run-openvswitch\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.416304 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-run-ovn\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.416323 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-run-systemd\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.416343 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxgz6\" 
(UniqueName: \"kubernetes.io/projected/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-kube-api-access-xxgz6\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.416395 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqpq8\" (UniqueName: \"kubernetes.io/projected/32f1531f-5034-48d4-b694-efc774226e37-kube-api-access-fqpq8\") on node \"crc\" DevicePath \"\"" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.416405 4735 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.416416 4735 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.416425 4735 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-node-log\") on node \"crc\" DevicePath \"\"" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.416433 4735 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-slash\") on node \"crc\" DevicePath \"\"" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.416441 4735 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.416450 4735 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.416459 4735 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/32f1531f-5034-48d4-b694-efc774226e37-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.416467 4735 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/32f1531f-5034-48d4-b694-efc774226e37-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.416476 4735 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.416484 4735 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/32f1531f-5034-48d4-b694-efc774226e37-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.416510 4735 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/32f1531f-5034-48d4-b694-efc774226e37-log-socket\") on node \"crc\" DevicePath \"\"" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.423625 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8dz9b_5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7/kube-multus/1.log" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.424232 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8dz9b_5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7/kube-multus/0.log" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.424265 4735 generic.go:334] "Generic (PLEG): container 
finished" podID="5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7" containerID="1dc2735dc0ade0160be84ac2d38365c7691541f3aeda56026e5489a2a2b6a556" exitCode=2 Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.424333 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8dz9b" event={"ID":"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7","Type":"ContainerDied","Data":"1dc2735dc0ade0160be84ac2d38365c7691541f3aeda56026e5489a2a2b6a556"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.424364 4735 scope.go:117] "RemoveContainer" containerID="cba3ee0ebe54c6342e6e3cbe579f04d1d97fede47be32bcb7ecac01696d8eb23" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.424859 4735 scope.go:117] "RemoveContainer" containerID="1dc2735dc0ade0160be84ac2d38365c7691541f3aeda56026e5489a2a2b6a556" Oct 01 10:26:39 crc kubenswrapper[4735]: E1001 10:26:39.425128 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-8dz9b_openshift-multus(5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7)\"" pod="openshift-multus/multus-8dz9b" podUID="5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.426918 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k5mgz_32f1531f-5034-48d4-b694-efc774226e37/ovnkube-controller/3.log" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.430044 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k5mgz_32f1531f-5034-48d4-b694-efc774226e37/ovn-acl-logging/0.log" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.430575 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k5mgz_32f1531f-5034-48d4-b694-efc774226e37/ovn-controller/0.log" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.430867 4735 generic.go:334] 
"Generic (PLEG): container finished" podID="32f1531f-5034-48d4-b694-efc774226e37" containerID="83ba1e5271b7461df27551275fcd38bfd0342e811f0565548c17fd46d0c947b2" exitCode=0 Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.430894 4735 generic.go:334] "Generic (PLEG): container finished" podID="32f1531f-5034-48d4-b694-efc774226e37" containerID="45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724" exitCode=0 Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.430889 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" event={"ID":"32f1531f-5034-48d4-b694-efc774226e37","Type":"ContainerDied","Data":"83ba1e5271b7461df27551275fcd38bfd0342e811f0565548c17fd46d0c947b2"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.430941 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" event={"ID":"32f1531f-5034-48d4-b694-efc774226e37","Type":"ContainerDied","Data":"45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.430966 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" event={"ID":"32f1531f-5034-48d4-b694-efc774226e37","Type":"ContainerDied","Data":"efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.430906 4735 generic.go:334] "Generic (PLEG): container finished" podID="32f1531f-5034-48d4-b694-efc774226e37" containerID="efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b" exitCode=0 Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431006 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431019 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" event={"ID":"32f1531f-5034-48d4-b694-efc774226e37","Type":"ContainerDied","Data":"a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431006 4735 generic.go:334] "Generic (PLEG): container finished" podID="32f1531f-5034-48d4-b694-efc774226e37" containerID="a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1" exitCode=0 Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431132 4735 generic.go:334] "Generic (PLEG): container finished" podID="32f1531f-5034-48d4-b694-efc774226e37" containerID="cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55" exitCode=0 Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431149 4735 generic.go:334] "Generic (PLEG): container finished" podID="32f1531f-5034-48d4-b694-efc774226e37" containerID="033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5" exitCode=0 Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431164 4735 generic.go:334] "Generic (PLEG): container finished" podID="32f1531f-5034-48d4-b694-efc774226e37" containerID="7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766" exitCode=143 Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431179 4735 generic.go:334] "Generic (PLEG): container finished" podID="32f1531f-5034-48d4-b694-efc774226e37" containerID="54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82" exitCode=143 Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431201 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" event={"ID":"32f1531f-5034-48d4-b694-efc774226e37","Type":"ContainerDied","Data":"cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55"} Oct 01 
10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431223 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" event={"ID":"32f1531f-5034-48d4-b694-efc774226e37","Type":"ContainerDied","Data":"033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431236 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"83ba1e5271b7461df27551275fcd38bfd0342e811f0565548c17fd46d0c947b2"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431246 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431252 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431257 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431263 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431268 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431273 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431278 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431284 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431289 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431296 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" event={"ID":"32f1531f-5034-48d4-b694-efc774226e37","Type":"ContainerDied","Data":"7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431303 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"83ba1e5271b7461df27551275fcd38bfd0342e811f0565548c17fd46d0c947b2"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431308 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431314 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431319 4735 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431323 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431328 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431334 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431339 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431344 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431350 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431358 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" event={"ID":"32f1531f-5034-48d4-b694-efc774226e37","Type":"ContainerDied","Data":"54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82"} Oct 01 
10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431367 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"83ba1e5271b7461df27551275fcd38bfd0342e811f0565548c17fd46d0c947b2"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431375 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431382 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431388 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431394 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431400 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431406 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431413 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766"} Oct 01 
10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431419 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431426 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431437 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k5mgz" event={"ID":"32f1531f-5034-48d4-b694-efc774226e37","Type":"ContainerDied","Data":"e73b6c84ab970e729df6b604555f77f77039617e883743768caa22d97e5dd9fc"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431448 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"83ba1e5271b7461df27551275fcd38bfd0342e811f0565548c17fd46d0c947b2"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431456 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431463 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431470 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431476 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431482 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431489 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431518 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431525 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82"} Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.431532 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1"} Oct 01 10:26:39 crc kubenswrapper[4735]: E1001 10:26:39.434827 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cert-manager-webhook\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/jetstack/cert-manager-webhook:v1.14.4\\\"\"" pod="cert-manager/cert-manager-webhook-5655c58dd6-d2xk7" podUID="a4f1a7ac-f922-4a82-8675-c87e0921512f" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.485636 4735 scope.go:117] "RemoveContainer" containerID="83ba1e5271b7461df27551275fcd38bfd0342e811f0565548c17fd46d0c947b2" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.497682 4735 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-k5mgz"] Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.502168 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-k5mgz"] Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.507968 4735 scope.go:117] "RemoveContainer" containerID="2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.517731 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-ovnkube-config\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.517782 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-host-cni-netd\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.517828 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-host-run-netns\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.517846 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-host-kubelet\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc 
kubenswrapper[4735]: I1001 10:26:39.517863 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-host-slash\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.517881 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-run-openvswitch\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.517899 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-host-cni-bin\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.517924 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-run-ovn\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.517948 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-run-systemd\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.517977 4735 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-xxgz6\" (UniqueName: \"kubernetes.io/projected/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-kube-api-access-xxgz6\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.517989 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-host-slash\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.518007 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-host-run-netns\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.518048 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-host-cni-netd\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.518000 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-var-lib-openvswitch\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.518071 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-run-ovn\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.518086 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-host-cni-bin\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.518078 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-host-kubelet\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.518108 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-var-lib-openvswitch\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.518140 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-run-openvswitch\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.518137 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-env-overrides\") pod \"ovnkube-node-zdqk4\" (UID: 
\"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.518170 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-run-systemd\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.518330 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-node-log\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.518370 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.518400 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-ovnkube-script-lib\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.518419 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-log-socket\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.518421 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.518441 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-ovn-node-metrics-cert\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.518479 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-host-run-ovn-kubernetes\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.518463 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-node-log\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.518540 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-systemd-units\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.518561 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-ovnkube-config\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.518595 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-log-socket\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.518523 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-systemd-units\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.518619 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-host-run-ovn-kubernetes\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.518671 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-etc-openvswitch\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: 
I1001 10:26:39.518820 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-etc-openvswitch\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.519156 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-ovnkube-script-lib\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.519551 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-env-overrides\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.526376 4735 scope.go:117] "RemoveContainer" containerID="45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.526999 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-ovn-node-metrics-cert\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.535232 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxgz6\" (UniqueName: \"kubernetes.io/projected/8fed40d1-c7e7-4806-8afa-dbcab61c1fc2-kube-api-access-xxgz6\") pod \"ovnkube-node-zdqk4\" (UID: \"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.540590 4735 scope.go:117] "RemoveContainer" containerID="efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.553301 4735 scope.go:117] "RemoveContainer" containerID="a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.564103 4735 scope.go:117] "RemoveContainer" containerID="cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.573532 4735 scope.go:117] "RemoveContainer" containerID="033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.584418 4735 scope.go:117] "RemoveContainer" containerID="7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.610585 4735 scope.go:117] "RemoveContainer" containerID="54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.626755 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.629639 4735 scope.go:117] "RemoveContainer" containerID="b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.647924 4735 scope.go:117] "RemoveContainer" containerID="83ba1e5271b7461df27551275fcd38bfd0342e811f0565548c17fd46d0c947b2" Oct 01 10:26:39 crc kubenswrapper[4735]: E1001 10:26:39.648355 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83ba1e5271b7461df27551275fcd38bfd0342e811f0565548c17fd46d0c947b2\": container with ID starting with 83ba1e5271b7461df27551275fcd38bfd0342e811f0565548c17fd46d0c947b2 not found: ID does not exist" containerID="83ba1e5271b7461df27551275fcd38bfd0342e811f0565548c17fd46d0c947b2" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.648403 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83ba1e5271b7461df27551275fcd38bfd0342e811f0565548c17fd46d0c947b2"} err="failed to get container status \"83ba1e5271b7461df27551275fcd38bfd0342e811f0565548c17fd46d0c947b2\": rpc error: code = NotFound desc = could not find container \"83ba1e5271b7461df27551275fcd38bfd0342e811f0565548c17fd46d0c947b2\": container with ID starting with 83ba1e5271b7461df27551275fcd38bfd0342e811f0565548c17fd46d0c947b2 not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.648432 4735 scope.go:117] "RemoveContainer" containerID="2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0" Oct 01 10:26:39 crc kubenswrapper[4735]: E1001 10:26:39.648915 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0\": container with ID starting with 
2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0 not found: ID does not exist" containerID="2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.648945 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0"} err="failed to get container status \"2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0\": rpc error: code = NotFound desc = could not find container \"2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0\": container with ID starting with 2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0 not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.648959 4735 scope.go:117] "RemoveContainer" containerID="45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724" Oct 01 10:26:39 crc kubenswrapper[4735]: E1001 10:26:39.649277 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\": container with ID starting with 45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724 not found: ID does not exist" containerID="45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.649303 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724"} err="failed to get container status \"45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\": rpc error: code = NotFound desc = could not find container \"45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\": container with ID starting with 45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724 not found: ID does not 
exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.649321 4735 scope.go:117] "RemoveContainer" containerID="efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b" Oct 01 10:26:39 crc kubenswrapper[4735]: E1001 10:26:39.649600 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\": container with ID starting with efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b not found: ID does not exist" containerID="efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.649624 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b"} err="failed to get container status \"efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\": rpc error: code = NotFound desc = could not find container \"efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\": container with ID starting with efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.649643 4735 scope.go:117] "RemoveContainer" containerID="a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1" Oct 01 10:26:39 crc kubenswrapper[4735]: E1001 10:26:39.650055 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\": container with ID starting with a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1 not found: ID does not exist" containerID="a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.650106 4735 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1"} err="failed to get container status \"a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\": rpc error: code = NotFound desc = could not find container \"a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\": container with ID starting with a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1 not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.650142 4735 scope.go:117] "RemoveContainer" containerID="cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55" Oct 01 10:26:39 crc kubenswrapper[4735]: E1001 10:26:39.650457 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\": container with ID starting with cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55 not found: ID does not exist" containerID="cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.650489 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55"} err="failed to get container status \"cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\": rpc error: code = NotFound desc = could not find container \"cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\": container with ID starting with cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55 not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.650525 4735 scope.go:117] "RemoveContainer" containerID="033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5" Oct 01 10:26:39 crc kubenswrapper[4735]: E1001 10:26:39.650778 4735 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\": container with ID starting with 033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5 not found: ID does not exist" containerID="033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.650804 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5"} err="failed to get container status \"033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\": rpc error: code = NotFound desc = could not find container \"033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\": container with ID starting with 033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5 not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.650819 4735 scope.go:117] "RemoveContainer" containerID="7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766" Oct 01 10:26:39 crc kubenswrapper[4735]: E1001 10:26:39.651122 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\": container with ID starting with 7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766 not found: ID does not exist" containerID="7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.651153 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766"} err="failed to get container status \"7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\": rpc error: code = NotFound desc = could 
not find container \"7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\": container with ID starting with 7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766 not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.651177 4735 scope.go:117] "RemoveContainer" containerID="54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82" Oct 01 10:26:39 crc kubenswrapper[4735]: E1001 10:26:39.651425 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\": container with ID starting with 54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82 not found: ID does not exist" containerID="54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.651457 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82"} err="failed to get container status \"54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\": rpc error: code = NotFound desc = could not find container \"54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\": container with ID starting with 54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82 not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.651469 4735 scope.go:117] "RemoveContainer" containerID="b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1" Oct 01 10:26:39 crc kubenswrapper[4735]: E1001 10:26:39.651825 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\": container with ID starting with b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1 not found: 
ID does not exist" containerID="b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.651857 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1"} err="failed to get container status \"b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\": rpc error: code = NotFound desc = could not find container \"b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\": container with ID starting with b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1 not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.651878 4735 scope.go:117] "RemoveContainer" containerID="83ba1e5271b7461df27551275fcd38bfd0342e811f0565548c17fd46d0c947b2" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.652201 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83ba1e5271b7461df27551275fcd38bfd0342e811f0565548c17fd46d0c947b2"} err="failed to get container status \"83ba1e5271b7461df27551275fcd38bfd0342e811f0565548c17fd46d0c947b2\": rpc error: code = NotFound desc = could not find container \"83ba1e5271b7461df27551275fcd38bfd0342e811f0565548c17fd46d0c947b2\": container with ID starting with 83ba1e5271b7461df27551275fcd38bfd0342e811f0565548c17fd46d0c947b2 not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.652242 4735 scope.go:117] "RemoveContainer" containerID="2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.652559 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0"} err="failed to get container status \"2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0\": rpc error: code = 
NotFound desc = could not find container \"2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0\": container with ID starting with 2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0 not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.652589 4735 scope.go:117] "RemoveContainer" containerID="45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.652844 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724"} err="failed to get container status \"45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\": rpc error: code = NotFound desc = could not find container \"45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\": container with ID starting with 45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724 not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.652873 4735 scope.go:117] "RemoveContainer" containerID="efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.653121 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b"} err="failed to get container status \"efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\": rpc error: code = NotFound desc = could not find container \"efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\": container with ID starting with efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.653147 4735 scope.go:117] "RemoveContainer" containerID="a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1" Oct 01 10:26:39 crc 
kubenswrapper[4735]: I1001 10:26:39.653512 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1"} err="failed to get container status \"a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\": rpc error: code = NotFound desc = could not find container \"a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\": container with ID starting with a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1 not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.653542 4735 scope.go:117] "RemoveContainer" containerID="cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.653907 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55"} err="failed to get container status \"cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\": rpc error: code = NotFound desc = could not find container \"cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\": container with ID starting with cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55 not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.653931 4735 scope.go:117] "RemoveContainer" containerID="033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.655032 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5"} err="failed to get container status \"033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\": rpc error: code = NotFound desc = could not find container \"033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\": container 
with ID starting with 033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5 not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.655516 4735 scope.go:117] "RemoveContainer" containerID="7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.655946 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766"} err="failed to get container status \"7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\": rpc error: code = NotFound desc = could not find container \"7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\": container with ID starting with 7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766 not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.655987 4735 scope.go:117] "RemoveContainer" containerID="54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.656381 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82"} err="failed to get container status \"54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\": rpc error: code = NotFound desc = could not find container \"54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\": container with ID starting with 54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82 not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.656401 4735 scope.go:117] "RemoveContainer" containerID="b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.656754 4735 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1"} err="failed to get container status \"b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\": rpc error: code = NotFound desc = could not find container \"b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\": container with ID starting with b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1 not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.656792 4735 scope.go:117] "RemoveContainer" containerID="83ba1e5271b7461df27551275fcd38bfd0342e811f0565548c17fd46d0c947b2" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.657097 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83ba1e5271b7461df27551275fcd38bfd0342e811f0565548c17fd46d0c947b2"} err="failed to get container status \"83ba1e5271b7461df27551275fcd38bfd0342e811f0565548c17fd46d0c947b2\": rpc error: code = NotFound desc = could not find container \"83ba1e5271b7461df27551275fcd38bfd0342e811f0565548c17fd46d0c947b2\": container with ID starting with 83ba1e5271b7461df27551275fcd38bfd0342e811f0565548c17fd46d0c947b2 not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.657138 4735 scope.go:117] "RemoveContainer" containerID="2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.658100 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0"} err="failed to get container status \"2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0\": rpc error: code = NotFound desc = could not find container \"2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0\": container with ID starting with 2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0 not found: ID does not 
exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.658122 4735 scope.go:117] "RemoveContainer" containerID="45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.658513 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724"} err="failed to get container status \"45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\": rpc error: code = NotFound desc = could not find container \"45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\": container with ID starting with 45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724 not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.658536 4735 scope.go:117] "RemoveContainer" containerID="efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.659093 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b"} err="failed to get container status \"efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\": rpc error: code = NotFound desc = could not find container \"efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\": container with ID starting with efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.659115 4735 scope.go:117] "RemoveContainer" containerID="a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.659457 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1"} err="failed to get container status 
\"a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\": rpc error: code = NotFound desc = could not find container \"a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\": container with ID starting with a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1 not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.659479 4735 scope.go:117] "RemoveContainer" containerID="cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.660014 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55"} err="failed to get container status \"cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\": rpc error: code = NotFound desc = could not find container \"cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\": container with ID starting with cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55 not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.660038 4735 scope.go:117] "RemoveContainer" containerID="033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.660987 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5"} err="failed to get container status \"033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\": rpc error: code = NotFound desc = could not find container \"033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\": container with ID starting with 033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5 not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.661024 4735 scope.go:117] "RemoveContainer" 
containerID="7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.661301 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766"} err="failed to get container status \"7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\": rpc error: code = NotFound desc = could not find container \"7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\": container with ID starting with 7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766 not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.661329 4735 scope.go:117] "RemoveContainer" containerID="54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.661619 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82"} err="failed to get container status \"54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\": rpc error: code = NotFound desc = could not find container \"54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\": container with ID starting with 54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82 not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.661652 4735 scope.go:117] "RemoveContainer" containerID="b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.661874 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1"} err="failed to get container status \"b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\": rpc error: code = NotFound desc = could 
not find container \"b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\": container with ID starting with b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1 not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.661890 4735 scope.go:117] "RemoveContainer" containerID="83ba1e5271b7461df27551275fcd38bfd0342e811f0565548c17fd46d0c947b2" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.662250 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83ba1e5271b7461df27551275fcd38bfd0342e811f0565548c17fd46d0c947b2"} err="failed to get container status \"83ba1e5271b7461df27551275fcd38bfd0342e811f0565548c17fd46d0c947b2\": rpc error: code = NotFound desc = could not find container \"83ba1e5271b7461df27551275fcd38bfd0342e811f0565548c17fd46d0c947b2\": container with ID starting with 83ba1e5271b7461df27551275fcd38bfd0342e811f0565548c17fd46d0c947b2 not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.662274 4735 scope.go:117] "RemoveContainer" containerID="2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.662517 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0"} err="failed to get container status \"2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0\": rpc error: code = NotFound desc = could not find container \"2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0\": container with ID starting with 2bdc2689015b6cb936075aeb691bbcb67a30f3983e6aeb50f78eaafde01a4cf0 not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.662543 4735 scope.go:117] "RemoveContainer" containerID="45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 
10:26:39.662877 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724"} err="failed to get container status \"45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\": rpc error: code = NotFound desc = could not find container \"45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724\": container with ID starting with 45e23c97660e0d434ac6e101416c28c2b6aec5a30b0a5d903febe9d4bbed8724 not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.662901 4735 scope.go:117] "RemoveContainer" containerID="efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.663336 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b"} err="failed to get container status \"efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\": rpc error: code = NotFound desc = could not find container \"efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b\": container with ID starting with efb5598cd0106c03c67089cbb719f570c5944ce4fbfb9196f050bc6c731fb12b not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.663360 4735 scope.go:117] "RemoveContainer" containerID="a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.663879 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1"} err="failed to get container status \"a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\": rpc error: code = NotFound desc = could not find container \"a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1\": container with ID starting with 
a90fbfd2919322bfb3e478408814e2146c2141683968024d973efc6c5ec8deb1 not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.663905 4735 scope.go:117] "RemoveContainer" containerID="cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.664222 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55"} err="failed to get container status \"cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\": rpc error: code = NotFound desc = could not find container \"cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55\": container with ID starting with cee25e860cd3ef8470ff16d761967ccfca57193d4981a0885f47d77b3a7e6b55 not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.664270 4735 scope.go:117] "RemoveContainer" containerID="033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.665333 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5"} err="failed to get container status \"033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\": rpc error: code = NotFound desc = could not find container \"033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5\": container with ID starting with 033b8b21d1623d9b2104670656da992c187cc109de8f576cbea04ecd818bc4b5 not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.665370 4735 scope.go:117] "RemoveContainer" containerID="7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.665702 4735 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766"} err="failed to get container status \"7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\": rpc error: code = NotFound desc = could not find container \"7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766\": container with ID starting with 7f8329601fcee26eeb7ed0297d3320fe27ee52e3512115539b5482d47d82d766 not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.665744 4735 scope.go:117] "RemoveContainer" containerID="54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.666003 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82"} err="failed to get container status \"54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\": rpc error: code = NotFound desc = could not find container \"54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82\": container with ID starting with 54d9813178ee5ad53bac90b272c82f43ad3331ea303f3c7b389c5ef9b6b77f82 not found: ID does not exist" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.666031 4735 scope.go:117] "RemoveContainer" containerID="b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.666697 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1"} err="failed to get container status \"b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\": rpc error: code = NotFound desc = could not find container \"b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1\": container with ID starting with b4fe4820f7e603685c026d7e5a153c92719c8e671cf6173714bf81d2dba033a1 not found: ID does not 
exist" Oct 01 10:26:39 crc kubenswrapper[4735]: E1001 10:26:39.755811 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading blob sha256:8686cf6e3c7a4d0ae02b61306513ad382b23f1f780a558793e77fda9217269e2: fetching blob: received unexpected HTTP status: 504 Gateway Time-out" image="quay.io/jetstack/cert-manager-controller:v1.14.4" Oct 01 10:26:39 crc kubenswrapper[4735]: E1001 10:26:39.755971 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cert-manager-controller,Image:quay.io/jetstack/cert-manager-controller:v1.14.4,Command:[],Args:[--v=2 --cluster-resource-namespace=$(POD_NAMESPACE) --leader-election-namespace=kube-system --acme-http01-solver-image=quay.io/jetstack/cert-manager-acmesolver:v1.14.4 --max-concurrent-challenges=60],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:9402,Protocol:TCP,HostIP:,},ContainerPort{Name:http-healthz,HostPort:0,ContainerPort:9403,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nz6pw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{1 0 
http-healthz},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:15,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:8,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000680000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cert-manager-5b446d88c5-srpsm_cert-manager(4a6804d5-21c5-4d3d-9504-d769df881c52): ErrImagePull: copying system image from manifest list: reading blob sha256:8686cf6e3c7a4d0ae02b61306513ad382b23f1f780a558793e77fda9217269e2: fetching blob: received unexpected HTTP status: 504 Gateway Time-out" logger="UnhandledError" Oct 01 10:26:39 crc kubenswrapper[4735]: E1001 10:26:39.757230 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cert-manager-controller\" with ErrImagePull: \"copying system image from manifest list: reading blob sha256:8686cf6e3c7a4d0ae02b61306513ad382b23f1f780a558793e77fda9217269e2: fetching blob: received unexpected HTTP status: 504 Gateway Time-out\"" pod="cert-manager/cert-manager-5b446d88c5-srpsm" podUID="4a6804d5-21c5-4d3d-9504-d769df881c52" Oct 01 10:26:39 crc kubenswrapper[4735]: I1001 10:26:39.902088 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32f1531f-5034-48d4-b694-efc774226e37" path="/var/lib/kubelet/pods/32f1531f-5034-48d4-b694-efc774226e37/volumes" Oct 01 10:26:40 crc kubenswrapper[4735]: I1001 10:26:40.441092 4735 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8dz9b_5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7/kube-multus/1.log" Oct 01 10:26:40 crc kubenswrapper[4735]: I1001 10:26:40.442896 4735 generic.go:334] "Generic (PLEG): container finished" podID="8fed40d1-c7e7-4806-8afa-dbcab61c1fc2" containerID="e373ccd4b7f28e759bb9390207c7cb825b1d566b26e76836b78d72178d4018b1" exitCode=0 Oct 01 10:26:40 crc kubenswrapper[4735]: I1001 10:26:40.442924 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" event={"ID":"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2","Type":"ContainerDied","Data":"e373ccd4b7f28e759bb9390207c7cb825b1d566b26e76836b78d72178d4018b1"} Oct 01 10:26:40 crc kubenswrapper[4735]: I1001 10:26:40.442968 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" event={"ID":"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2","Type":"ContainerStarted","Data":"8ef3b83627ec4b4856c251637a0e3f88faf86f630392bd7db9e81826203a0150"} Oct 01 10:26:40 crc kubenswrapper[4735]: E1001 10:26:40.444390 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cert-manager-controller\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/jetstack/cert-manager-controller:v1.14.4\\\"\"" pod="cert-manager/cert-manager-5b446d88c5-srpsm" podUID="4a6804d5-21c5-4d3d-9504-d769df881c52" Oct 01 10:26:41 crc kubenswrapper[4735]: I1001 10:26:41.465876 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" event={"ID":"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2","Type":"ContainerStarted","Data":"2bce90184d2542bb6666672122e0fc69b9520deb958df36e1db3a793ab4e8123"} Oct 01 10:26:41 crc kubenswrapper[4735]: I1001 10:26:41.466199 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" 
event={"ID":"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2","Type":"ContainerStarted","Data":"ebb782ba8b575bd26160b6df7a0ba59083100845b71381ba8fbe787f28e2c284"} Oct 01 10:26:41 crc kubenswrapper[4735]: I1001 10:26:41.466210 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" event={"ID":"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2","Type":"ContainerStarted","Data":"2d846bf7fd587b50f43d89ea6094463c50023fb53e9c152741851244d2061781"} Oct 01 10:26:41 crc kubenswrapper[4735]: I1001 10:26:41.466218 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" event={"ID":"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2","Type":"ContainerStarted","Data":"2983ee0080be94c3bf624292d02d843bfdc45c1ffe99c8c5f77ef579cdb74be7"} Oct 01 10:26:41 crc kubenswrapper[4735]: I1001 10:26:41.466226 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" event={"ID":"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2","Type":"ContainerStarted","Data":"8d0213de4b6aa71697c1ff4d6e2dfaf2d906254e92a98916da104b3be6c94453"} Oct 01 10:26:41 crc kubenswrapper[4735]: I1001 10:26:41.466234 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" event={"ID":"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2","Type":"ContainerStarted","Data":"1c52f6f98a9446d01c118dce16e481ef72ecb3893ed9ec6e19e7bc6fc8a2ff7a"} Oct 01 10:26:43 crc kubenswrapper[4735]: I1001 10:26:43.484940 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" event={"ID":"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2","Type":"ContainerStarted","Data":"0b0fc7aebbc4981b82c41ba0de1dfd1e52925e911cfc7f8dce92bb1f02caf2f0"} Oct 01 10:26:46 crc kubenswrapper[4735]: I1001 10:26:46.503830 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" 
event={"ID":"8fed40d1-c7e7-4806-8afa-dbcab61c1fc2","Type":"ContainerStarted","Data":"4aab3f547d28537e81993d6101aadfd4aa0fca80f9396553b37749dfb2a75ad2"} Oct 01 10:26:46 crc kubenswrapper[4735]: I1001 10:26:46.504398 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:46 crc kubenswrapper[4735]: I1001 10:26:46.504410 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:46 crc kubenswrapper[4735]: I1001 10:26:46.504418 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:46 crc kubenswrapper[4735]: I1001 10:26:46.552433 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:46 crc kubenswrapper[4735]: I1001 10:26:46.552886 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:26:46 crc kubenswrapper[4735]: I1001 10:26:46.583995 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" podStartSLOduration=7.583970025 podStartE2EDuration="7.583970025s" podCreationTimestamp="2025-10-01 10:26:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:26:46.551601299 +0000 UTC m=+565.244422591" watchObservedRunningTime="2025-10-01 10:26:46.583970025 +0000 UTC m=+565.276791298" Oct 01 10:26:52 crc kubenswrapper[4735]: I1001 10:26:52.535000 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-d2xk7" event={"ID":"a4f1a7ac-f922-4a82-8675-c87e0921512f","Type":"ContainerStarted","Data":"0aab064aad3ec2ba7ca470e19a8bfe6ebc05237d13fad8cfe1576f88a4ed8203"} Oct 01 10:26:52 crc 
kubenswrapper[4735]: I1001 10:26:52.535669 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-d2xk7" Oct 01 10:26:52 crc kubenswrapper[4735]: I1001 10:26:52.550737 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-d2xk7" podStartSLOduration=1.880047819 podStartE2EDuration="25.550717249s" podCreationTimestamp="2025-10-01 10:26:27 +0000 UTC" firstStartedPulling="2025-10-01 10:26:28.54526194 +0000 UTC m=+547.238083202" lastFinishedPulling="2025-10-01 10:26:52.21593137 +0000 UTC m=+570.908752632" observedRunningTime="2025-10-01 10:26:52.550033639 +0000 UTC m=+571.242854901" watchObservedRunningTime="2025-10-01 10:26:52.550717249 +0000 UTC m=+571.243538511" Oct 01 10:26:54 crc kubenswrapper[4735]: I1001 10:26:54.897485 4735 scope.go:117] "RemoveContainer" containerID="1dc2735dc0ade0160be84ac2d38365c7691541f3aeda56026e5489a2a2b6a556" Oct 01 10:26:55 crc kubenswrapper[4735]: I1001 10:26:55.555829 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8dz9b_5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7/kube-multus/1.log" Oct 01 10:26:55 crc kubenswrapper[4735]: I1001 10:26:55.556307 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8dz9b" event={"ID":"5d8707c5-6fad-4ba7-b2ea-a0916dd86bf7","Type":"ContainerStarted","Data":"9a792d854ca22fcfcb8544ee8fc4217735ded4757e1314de5dd1f86921b16a5e"} Oct 01 10:26:57 crc kubenswrapper[4735]: I1001 10:26:57.570087 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-srpsm" event={"ID":"4a6804d5-21c5-4d3d-9504-d769df881c52","Type":"ContainerStarted","Data":"3d5f8b71136a60a586bed7c7f73bbb868aa13724030a1f0c218c8d24f0a94790"} Oct 01 10:26:57 crc kubenswrapper[4735]: I1001 10:26:57.592925 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="cert-manager/cert-manager-5b446d88c5-srpsm" podStartSLOduration=2.618345587 podStartE2EDuration="30.592904765s" podCreationTimestamp="2025-10-01 10:26:27 +0000 UTC" firstStartedPulling="2025-10-01 10:26:28.548692922 +0000 UTC m=+547.241514174" lastFinishedPulling="2025-10-01 10:26:56.52325209 +0000 UTC m=+575.216073352" observedRunningTime="2025-10-01 10:26:57.588938557 +0000 UTC m=+576.281759879" watchObservedRunningTime="2025-10-01 10:26:57.592904765 +0000 UTC m=+576.285726027" Oct 01 10:26:58 crc kubenswrapper[4735]: I1001 10:26:58.134531 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-d2xk7" Oct 01 10:27:05 crc kubenswrapper[4735]: I1001 10:27:05.486195 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 10:27:05 crc kubenswrapper[4735]: I1001 10:27:05.486747 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 10:27:05 crc kubenswrapper[4735]: I1001 10:27:05.486794 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" Oct 01 10:27:05 crc kubenswrapper[4735]: I1001 10:27:05.487383 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"218e50335c2b017baada525f436dc7da1909a27a86b76e27c3f9d13a94f70329"} pod="openshift-machine-config-operator/machine-config-daemon-xgg24" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 10:27:05 crc kubenswrapper[4735]: I1001 10:27:05.487441 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" containerID="cri-o://218e50335c2b017baada525f436dc7da1909a27a86b76e27c3f9d13a94f70329" gracePeriod=600 Oct 01 10:27:06 crc kubenswrapper[4735]: I1001 10:27:06.636468 4735 generic.go:334] "Generic (PLEG): container finished" podID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerID="218e50335c2b017baada525f436dc7da1909a27a86b76e27c3f9d13a94f70329" exitCode=0 Oct 01 10:27:06 crc kubenswrapper[4735]: I1001 10:27:06.636558 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" event={"ID":"8c2fdbf0-2469-4ca0-8624-d63609123cd1","Type":"ContainerDied","Data":"218e50335c2b017baada525f436dc7da1909a27a86b76e27c3f9d13a94f70329"} Oct 01 10:27:06 crc kubenswrapper[4735]: I1001 10:27:06.637024 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" event={"ID":"8c2fdbf0-2469-4ca0-8624-d63609123cd1","Type":"ContainerStarted","Data":"2af2e089638b91b79b45466ec706a3fb255cc6d4b97f5b95a6d6830b44a807b5"} Oct 01 10:27:06 crc kubenswrapper[4735]: I1001 10:27:06.637057 4735 scope.go:117] "RemoveContainer" containerID="37cba53e290bdc38e83217d84214f3378c3c1355865e08cfd659f1334766bc2e" Oct 01 10:27:09 crc kubenswrapper[4735]: I1001 10:27:09.653591 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zdqk4" Oct 01 10:27:37 crc kubenswrapper[4735]: I1001 10:27:37.243354 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn"] Oct 01 
10:27:37 crc kubenswrapper[4735]: I1001 10:27:37.244763 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn" Oct 01 10:27:37 crc kubenswrapper[4735]: I1001 10:27:37.248391 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 01 10:27:37 crc kubenswrapper[4735]: I1001 10:27:37.257449 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn"] Oct 01 10:27:37 crc kubenswrapper[4735]: I1001 10:27:37.386883 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1cb929b-2595-4752-a499-91d2401f5755-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn\" (UID: \"b1cb929b-2595-4752-a499-91d2401f5755\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn" Oct 01 10:27:37 crc kubenswrapper[4735]: I1001 10:27:37.386975 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1cb929b-2595-4752-a499-91d2401f5755-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn\" (UID: \"b1cb929b-2595-4752-a499-91d2401f5755\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn" Oct 01 10:27:37 crc kubenswrapper[4735]: I1001 10:27:37.387226 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjlnr\" (UniqueName: \"kubernetes.io/projected/b1cb929b-2595-4752-a499-91d2401f5755-kube-api-access-xjlnr\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn\" (UID: \"b1cb929b-2595-4752-a499-91d2401f5755\") " 
pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn" Oct 01 10:27:37 crc kubenswrapper[4735]: I1001 10:27:37.488678 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1cb929b-2595-4752-a499-91d2401f5755-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn\" (UID: \"b1cb929b-2595-4752-a499-91d2401f5755\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn" Oct 01 10:27:37 crc kubenswrapper[4735]: I1001 10:27:37.488776 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1cb929b-2595-4752-a499-91d2401f5755-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn\" (UID: \"b1cb929b-2595-4752-a499-91d2401f5755\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn" Oct 01 10:27:37 crc kubenswrapper[4735]: I1001 10:27:37.488848 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjlnr\" (UniqueName: \"kubernetes.io/projected/b1cb929b-2595-4752-a499-91d2401f5755-kube-api-access-xjlnr\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn\" (UID: \"b1cb929b-2595-4752-a499-91d2401f5755\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn" Oct 01 10:27:37 crc kubenswrapper[4735]: I1001 10:27:37.489600 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1cb929b-2595-4752-a499-91d2401f5755-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn\" (UID: \"b1cb929b-2595-4752-a499-91d2401f5755\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn" Oct 01 10:27:37 crc kubenswrapper[4735]: I1001 10:27:37.489919 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1cb929b-2595-4752-a499-91d2401f5755-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn\" (UID: \"b1cb929b-2595-4752-a499-91d2401f5755\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn" Oct 01 10:27:37 crc kubenswrapper[4735]: I1001 10:27:37.526416 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjlnr\" (UniqueName: \"kubernetes.io/projected/b1cb929b-2595-4752-a499-91d2401f5755-kube-api-access-xjlnr\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn\" (UID: \"b1cb929b-2595-4752-a499-91d2401f5755\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn" Oct 01 10:27:37 crc kubenswrapper[4735]: I1001 10:27:37.561034 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn" Oct 01 10:27:37 crc kubenswrapper[4735]: I1001 10:27:37.836455 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn"] Oct 01 10:27:38 crc kubenswrapper[4735]: I1001 10:27:38.829664 4735 generic.go:334] "Generic (PLEG): container finished" podID="b1cb929b-2595-4752-a499-91d2401f5755" containerID="dbcfb5d3aa335a2ad4e2ac75981164f49c092bae9034c840076f2cba4e05442a" exitCode=0 Oct 01 10:27:38 crc kubenswrapper[4735]: I1001 10:27:38.829726 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn" event={"ID":"b1cb929b-2595-4752-a499-91d2401f5755","Type":"ContainerDied","Data":"dbcfb5d3aa335a2ad4e2ac75981164f49c092bae9034c840076f2cba4e05442a"} Oct 01 10:27:38 crc kubenswrapper[4735]: I1001 10:27:38.829769 4735 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn" event={"ID":"b1cb929b-2595-4752-a499-91d2401f5755","Type":"ContainerStarted","Data":"479d81657db948b5a30b2931ba5c5aa48a4b48cf3c3304ce841b1f3036e59a75"} Oct 01 10:27:40 crc kubenswrapper[4735]: I1001 10:27:40.843086 4735 generic.go:334] "Generic (PLEG): container finished" podID="b1cb929b-2595-4752-a499-91d2401f5755" containerID="08d458d125a632b8638bc20f843becfa9f524c467e798f49650b95567bce26d7" exitCode=0 Oct 01 10:27:40 crc kubenswrapper[4735]: I1001 10:27:40.843178 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn" event={"ID":"b1cb929b-2595-4752-a499-91d2401f5755","Type":"ContainerDied","Data":"08d458d125a632b8638bc20f843becfa9f524c467e798f49650b95567bce26d7"} Oct 01 10:27:41 crc kubenswrapper[4735]: I1001 10:27:41.852807 4735 generic.go:334] "Generic (PLEG): container finished" podID="b1cb929b-2595-4752-a499-91d2401f5755" containerID="1107e23094e2dfcfca5feb45ada9812b9d12faafbc3df47d67934c5abac326f3" exitCode=0 Oct 01 10:27:41 crc kubenswrapper[4735]: I1001 10:27:41.852907 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn" event={"ID":"b1cb929b-2595-4752-a499-91d2401f5755","Type":"ContainerDied","Data":"1107e23094e2dfcfca5feb45ada9812b9d12faafbc3df47d67934c5abac326f3"} Oct 01 10:27:43 crc kubenswrapper[4735]: I1001 10:27:43.129320 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn" Oct 01 10:27:43 crc kubenswrapper[4735]: I1001 10:27:43.276890 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1cb929b-2595-4752-a499-91d2401f5755-util\") pod \"b1cb929b-2595-4752-a499-91d2401f5755\" (UID: \"b1cb929b-2595-4752-a499-91d2401f5755\") " Oct 01 10:27:43 crc kubenswrapper[4735]: I1001 10:27:43.277041 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjlnr\" (UniqueName: \"kubernetes.io/projected/b1cb929b-2595-4752-a499-91d2401f5755-kube-api-access-xjlnr\") pod \"b1cb929b-2595-4752-a499-91d2401f5755\" (UID: \"b1cb929b-2595-4752-a499-91d2401f5755\") " Oct 01 10:27:43 crc kubenswrapper[4735]: I1001 10:27:43.277083 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1cb929b-2595-4752-a499-91d2401f5755-bundle\") pod \"b1cb929b-2595-4752-a499-91d2401f5755\" (UID: \"b1cb929b-2595-4752-a499-91d2401f5755\") " Oct 01 10:27:43 crc kubenswrapper[4735]: I1001 10:27:43.277696 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1cb929b-2595-4752-a499-91d2401f5755-bundle" (OuterVolumeSpecName: "bundle") pod "b1cb929b-2595-4752-a499-91d2401f5755" (UID: "b1cb929b-2595-4752-a499-91d2401f5755"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:27:43 crc kubenswrapper[4735]: I1001 10:27:43.282058 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1cb929b-2595-4752-a499-91d2401f5755-kube-api-access-xjlnr" (OuterVolumeSpecName: "kube-api-access-xjlnr") pod "b1cb929b-2595-4752-a499-91d2401f5755" (UID: "b1cb929b-2595-4752-a499-91d2401f5755"). InnerVolumeSpecName "kube-api-access-xjlnr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:27:43 crc kubenswrapper[4735]: I1001 10:27:43.300010 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1cb929b-2595-4752-a499-91d2401f5755-util" (OuterVolumeSpecName: "util") pod "b1cb929b-2595-4752-a499-91d2401f5755" (UID: "b1cb929b-2595-4752-a499-91d2401f5755"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:27:43 crc kubenswrapper[4735]: I1001 10:27:43.378437 4735 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1cb929b-2595-4752-a499-91d2401f5755-util\") on node \"crc\" DevicePath \"\"" Oct 01 10:27:43 crc kubenswrapper[4735]: I1001 10:27:43.378553 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjlnr\" (UniqueName: \"kubernetes.io/projected/b1cb929b-2595-4752-a499-91d2401f5755-kube-api-access-xjlnr\") on node \"crc\" DevicePath \"\"" Oct 01 10:27:43 crc kubenswrapper[4735]: I1001 10:27:43.378590 4735 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1cb929b-2595-4752-a499-91d2401f5755-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:27:43 crc kubenswrapper[4735]: I1001 10:27:43.869916 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn" event={"ID":"b1cb929b-2595-4752-a499-91d2401f5755","Type":"ContainerDied","Data":"479d81657db948b5a30b2931ba5c5aa48a4b48cf3c3304ce841b1f3036e59a75"} Oct 01 10:27:43 crc kubenswrapper[4735]: I1001 10:27:43.869992 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="479d81657db948b5a30b2931ba5c5aa48a4b48cf3c3304ce841b1f3036e59a75" Oct 01 10:27:43 crc kubenswrapper[4735]: I1001 10:27:43.870239 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn" Oct 01 10:27:44 crc kubenswrapper[4735]: I1001 10:27:44.896174 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-m6pnj"] Oct 01 10:27:44 crc kubenswrapper[4735]: E1001 10:27:44.896759 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1cb929b-2595-4752-a499-91d2401f5755" containerName="pull" Oct 01 10:27:44 crc kubenswrapper[4735]: I1001 10:27:44.896777 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1cb929b-2595-4752-a499-91d2401f5755" containerName="pull" Oct 01 10:27:44 crc kubenswrapper[4735]: E1001 10:27:44.896794 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1cb929b-2595-4752-a499-91d2401f5755" containerName="util" Oct 01 10:27:44 crc kubenswrapper[4735]: I1001 10:27:44.896802 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1cb929b-2595-4752-a499-91d2401f5755" containerName="util" Oct 01 10:27:44 crc kubenswrapper[4735]: E1001 10:27:44.896818 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1cb929b-2595-4752-a499-91d2401f5755" containerName="extract" Oct 01 10:27:44 crc kubenswrapper[4735]: I1001 10:27:44.896826 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1cb929b-2595-4752-a499-91d2401f5755" containerName="extract" Oct 01 10:27:44 crc kubenswrapper[4735]: I1001 10:27:44.896975 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1cb929b-2595-4752-a499-91d2401f5755" containerName="extract" Oct 01 10:27:44 crc kubenswrapper[4735]: I1001 10:27:44.897817 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-m6pnj" Oct 01 10:27:44 crc kubenswrapper[4735]: I1001 10:27:44.901337 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-4b25v" Oct 01 10:27:44 crc kubenswrapper[4735]: I1001 10:27:44.903277 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 01 10:27:44 crc kubenswrapper[4735]: I1001 10:27:44.906538 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 01 10:27:44 crc kubenswrapper[4735]: I1001 10:27:44.911102 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-m6pnj"] Oct 01 10:27:45 crc kubenswrapper[4735]: I1001 10:27:45.001668 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnxqp\" (UniqueName: \"kubernetes.io/projected/7da3bf5b-a383-430c-b587-62c7eabeedd1-kube-api-access-lnxqp\") pod \"nmstate-operator-5d6f6cfd66-m6pnj\" (UID: \"7da3bf5b-a383-430c-b587-62c7eabeedd1\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-m6pnj" Oct 01 10:27:45 crc kubenswrapper[4735]: I1001 10:27:45.103039 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnxqp\" (UniqueName: \"kubernetes.io/projected/7da3bf5b-a383-430c-b587-62c7eabeedd1-kube-api-access-lnxqp\") pod \"nmstate-operator-5d6f6cfd66-m6pnj\" (UID: \"7da3bf5b-a383-430c-b587-62c7eabeedd1\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-m6pnj" Oct 01 10:27:45 crc kubenswrapper[4735]: I1001 10:27:45.121147 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnxqp\" (UniqueName: \"kubernetes.io/projected/7da3bf5b-a383-430c-b587-62c7eabeedd1-kube-api-access-lnxqp\") pod \"nmstate-operator-5d6f6cfd66-m6pnj\" (UID: 
\"7da3bf5b-a383-430c-b587-62c7eabeedd1\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-m6pnj" Oct 01 10:27:45 crc kubenswrapper[4735]: I1001 10:27:45.217919 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-m6pnj" Oct 01 10:27:45 crc kubenswrapper[4735]: I1001 10:27:45.442198 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-m6pnj"] Oct 01 10:27:45 crc kubenswrapper[4735]: W1001 10:27:45.448897 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7da3bf5b_a383_430c_b587_62c7eabeedd1.slice/crio-0060ee1fde1cefdd0be2846a86d1d2769e762f4be5545fd518d47653242db809 WatchSource:0}: Error finding container 0060ee1fde1cefdd0be2846a86d1d2769e762f4be5545fd518d47653242db809: Status 404 returned error can't find the container with id 0060ee1fde1cefdd0be2846a86d1d2769e762f4be5545fd518d47653242db809 Oct 01 10:27:45 crc kubenswrapper[4735]: I1001 10:27:45.885318 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-m6pnj" event={"ID":"7da3bf5b-a383-430c-b587-62c7eabeedd1","Type":"ContainerStarted","Data":"0060ee1fde1cefdd0be2846a86d1d2769e762f4be5545fd518d47653242db809"} Oct 01 10:27:48 crc kubenswrapper[4735]: I1001 10:27:48.908250 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-m6pnj" event={"ID":"7da3bf5b-a383-430c-b587-62c7eabeedd1","Type":"ContainerStarted","Data":"d76491b3c9cf30ccb083def6bea90160e408f6c67fec63f68d242f50ed53fad8"} Oct 01 10:27:49 crc kubenswrapper[4735]: I1001 10:27:49.835845 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-m6pnj" podStartSLOduration=3.48103391 podStartE2EDuration="5.835826326s" podCreationTimestamp="2025-10-01 10:27:44 +0000 UTC" 
firstStartedPulling="2025-10-01 10:27:45.453390687 +0000 UTC m=+624.146211959" lastFinishedPulling="2025-10-01 10:27:47.808183113 +0000 UTC m=+626.501004375" observedRunningTime="2025-10-01 10:27:48.928739842 +0000 UTC m=+627.621561124" watchObservedRunningTime="2025-10-01 10:27:49.835826326 +0000 UTC m=+628.528647588" Oct 01 10:27:49 crc kubenswrapper[4735]: I1001 10:27:49.836511 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-mpptn"] Oct 01 10:27:49 crc kubenswrapper[4735]: I1001 10:27:49.837576 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-mpptn" Oct 01 10:27:49 crc kubenswrapper[4735]: I1001 10:27:49.839261 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-fxrvm" Oct 01 10:27:49 crc kubenswrapper[4735]: I1001 10:27:49.840477 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-xsrh7"] Oct 01 10:27:49 crc kubenswrapper[4735]: I1001 10:27:49.841223 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-xsrh7" Oct 01 10:27:49 crc kubenswrapper[4735]: I1001 10:27:49.843887 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 01 10:27:49 crc kubenswrapper[4735]: I1001 10:27:49.847965 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-mpptn"] Oct 01 10:27:49 crc kubenswrapper[4735]: I1001 10:27:49.859612 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-p4k5f"] Oct 01 10:27:49 crc kubenswrapper[4735]: I1001 10:27:49.860281 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-p4k5f" Oct 01 10:27:49 crc kubenswrapper[4735]: I1001 10:27:49.865232 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-xsrh7"] Oct 01 10:27:49 crc kubenswrapper[4735]: I1001 10:27:49.954032 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svjp6"] Oct 01 10:27:49 crc kubenswrapper[4735]: I1001 10:27:49.954835 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svjp6" Oct 01 10:27:49 crc kubenswrapper[4735]: I1001 10:27:49.960790 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 01 10:27:49 crc kubenswrapper[4735]: I1001 10:27:49.960805 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 01 10:27:49 crc kubenswrapper[4735]: I1001 10:27:49.960847 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-mn2wn" Oct 01 10:27:49 crc kubenswrapper[4735]: I1001 10:27:49.963427 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9lwj\" (UniqueName: \"kubernetes.io/projected/f37e734e-18a9-4b41-b06d-1da35b2d5654-kube-api-access-w9lwj\") pod \"nmstate-webhook-6d689559c5-xsrh7\" (UID: \"f37e734e-18a9-4b41-b06d-1da35b2d5654\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-xsrh7" Oct 01 10:27:49 crc kubenswrapper[4735]: I1001 10:27:49.963481 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/b82af7f6-7fb7-4e7a-9787-1f3b84969763-nmstate-lock\") pod \"nmstate-handler-p4k5f\" (UID: \"b82af7f6-7fb7-4e7a-9787-1f3b84969763\") " pod="openshift-nmstate/nmstate-handler-p4k5f" Oct 01 10:27:49 crc 
kubenswrapper[4735]: I1001 10:27:49.963617 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f37e734e-18a9-4b41-b06d-1da35b2d5654-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-xsrh7\" (UID: \"f37e734e-18a9-4b41-b06d-1da35b2d5654\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-xsrh7" Oct 01 10:27:49 crc kubenswrapper[4735]: I1001 10:27:49.963774 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/b82af7f6-7fb7-4e7a-9787-1f3b84969763-dbus-socket\") pod \"nmstate-handler-p4k5f\" (UID: \"b82af7f6-7fb7-4e7a-9787-1f3b84969763\") " pod="openshift-nmstate/nmstate-handler-p4k5f" Oct 01 10:27:49 crc kubenswrapper[4735]: I1001 10:27:49.963810 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqh44\" (UniqueName: \"kubernetes.io/projected/b82af7f6-7fb7-4e7a-9787-1f3b84969763-kube-api-access-dqh44\") pod \"nmstate-handler-p4k5f\" (UID: \"b82af7f6-7fb7-4e7a-9787-1f3b84969763\") " pod="openshift-nmstate/nmstate-handler-p4k5f" Oct 01 10:27:49 crc kubenswrapper[4735]: I1001 10:27:49.963892 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cvd9\" (UniqueName: \"kubernetes.io/projected/6f89445f-bbda-4a4e-8cc5-ceb03718ffed-kube-api-access-5cvd9\") pod \"nmstate-metrics-58fcddf996-mpptn\" (UID: \"6f89445f-bbda-4a4e-8cc5-ceb03718ffed\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-mpptn" Oct 01 10:27:49 crc kubenswrapper[4735]: I1001 10:27:49.963971 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/b82af7f6-7fb7-4e7a-9787-1f3b84969763-ovs-socket\") pod \"nmstate-handler-p4k5f\" (UID: \"b82af7f6-7fb7-4e7a-9787-1f3b84969763\") " 
pod="openshift-nmstate/nmstate-handler-p4k5f" Oct 01 10:27:49 crc kubenswrapper[4735]: I1001 10:27:49.974530 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svjp6"] Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.065739 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9lwj\" (UniqueName: \"kubernetes.io/projected/f37e734e-18a9-4b41-b06d-1da35b2d5654-kube-api-access-w9lwj\") pod \"nmstate-webhook-6d689559c5-xsrh7\" (UID: \"f37e734e-18a9-4b41-b06d-1da35b2d5654\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-xsrh7" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.066110 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/b82af7f6-7fb7-4e7a-9787-1f3b84969763-nmstate-lock\") pod \"nmstate-handler-p4k5f\" (UID: \"b82af7f6-7fb7-4e7a-9787-1f3b84969763\") " pod="openshift-nmstate/nmstate-handler-p4k5f" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.066165 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f37e734e-18a9-4b41-b06d-1da35b2d5654-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-xsrh7\" (UID: \"f37e734e-18a9-4b41-b06d-1da35b2d5654\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-xsrh7" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.066245 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/b82af7f6-7fb7-4e7a-9787-1f3b84969763-dbus-socket\") pod \"nmstate-handler-p4k5f\" (UID: \"b82af7f6-7fb7-4e7a-9787-1f3b84969763\") " pod="openshift-nmstate/nmstate-handler-p4k5f" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.066272 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/cb894fbc-36ef-4c41-ae21-dff369c41c99-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-svjp6\" (UID: \"cb894fbc-36ef-4c41-ae21-dff369c41c99\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svjp6" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.066298 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqh44\" (UniqueName: \"kubernetes.io/projected/b82af7f6-7fb7-4e7a-9787-1f3b84969763-kube-api-access-dqh44\") pod \"nmstate-handler-p4k5f\" (UID: \"b82af7f6-7fb7-4e7a-9787-1f3b84969763\") " pod="openshift-nmstate/nmstate-handler-p4k5f" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.066322 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/cb894fbc-36ef-4c41-ae21-dff369c41c99-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-svjp6\" (UID: \"cb894fbc-36ef-4c41-ae21-dff369c41c99\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svjp6" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.066371 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cvd9\" (UniqueName: \"kubernetes.io/projected/6f89445f-bbda-4a4e-8cc5-ceb03718ffed-kube-api-access-5cvd9\") pod \"nmstate-metrics-58fcddf996-mpptn\" (UID: \"6f89445f-bbda-4a4e-8cc5-ceb03718ffed\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-mpptn" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.066400 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srjlp\" (UniqueName: \"kubernetes.io/projected/cb894fbc-36ef-4c41-ae21-dff369c41c99-kube-api-access-srjlp\") pod \"nmstate-console-plugin-864bb6dfb5-svjp6\" (UID: \"cb894fbc-36ef-4c41-ae21-dff369c41c99\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svjp6" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 
10:27:50.066450 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/b82af7f6-7fb7-4e7a-9787-1f3b84969763-ovs-socket\") pod \"nmstate-handler-p4k5f\" (UID: \"b82af7f6-7fb7-4e7a-9787-1f3b84969763\") " pod="openshift-nmstate/nmstate-handler-p4k5f" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.066540 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/b82af7f6-7fb7-4e7a-9787-1f3b84969763-ovs-socket\") pod \"nmstate-handler-p4k5f\" (UID: \"b82af7f6-7fb7-4e7a-9787-1f3b84969763\") " pod="openshift-nmstate/nmstate-handler-p4k5f" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.066585 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/b82af7f6-7fb7-4e7a-9787-1f3b84969763-nmstate-lock\") pod \"nmstate-handler-p4k5f\" (UID: \"b82af7f6-7fb7-4e7a-9787-1f3b84969763\") " pod="openshift-nmstate/nmstate-handler-p4k5f" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.067271 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/b82af7f6-7fb7-4e7a-9787-1f3b84969763-dbus-socket\") pod \"nmstate-handler-p4k5f\" (UID: \"b82af7f6-7fb7-4e7a-9787-1f3b84969763\") " pod="openshift-nmstate/nmstate-handler-p4k5f" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.074059 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f37e734e-18a9-4b41-b06d-1da35b2d5654-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-xsrh7\" (UID: \"f37e734e-18a9-4b41-b06d-1da35b2d5654\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-xsrh7" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.085967 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9lwj\" (UniqueName: 
\"kubernetes.io/projected/f37e734e-18a9-4b41-b06d-1da35b2d5654-kube-api-access-w9lwj\") pod \"nmstate-webhook-6d689559c5-xsrh7\" (UID: \"f37e734e-18a9-4b41-b06d-1da35b2d5654\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-xsrh7" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.091749 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqh44\" (UniqueName: \"kubernetes.io/projected/b82af7f6-7fb7-4e7a-9787-1f3b84969763-kube-api-access-dqh44\") pod \"nmstate-handler-p4k5f\" (UID: \"b82af7f6-7fb7-4e7a-9787-1f3b84969763\") " pod="openshift-nmstate/nmstate-handler-p4k5f" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.093367 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cvd9\" (UniqueName: \"kubernetes.io/projected/6f89445f-bbda-4a4e-8cc5-ceb03718ffed-kube-api-access-5cvd9\") pod \"nmstate-metrics-58fcddf996-mpptn\" (UID: \"6f89445f-bbda-4a4e-8cc5-ceb03718ffed\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-mpptn" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.143300 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7bb696995c-s9vxj"] Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.143971 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bb696995c-s9vxj" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.154009 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-mpptn" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.155976 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7bb696995c-s9vxj"] Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.160637 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-xsrh7" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.167968 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srjlp\" (UniqueName: \"kubernetes.io/projected/cb894fbc-36ef-4c41-ae21-dff369c41c99-kube-api-access-srjlp\") pod \"nmstate-console-plugin-864bb6dfb5-svjp6\" (UID: \"cb894fbc-36ef-4c41-ae21-dff369c41c99\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svjp6" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.168084 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/cb894fbc-36ef-4c41-ae21-dff369c41c99-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-svjp6\" (UID: \"cb894fbc-36ef-4c41-ae21-dff369c41c99\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svjp6" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.168117 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/cb894fbc-36ef-4c41-ae21-dff369c41c99-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-svjp6\" (UID: \"cb894fbc-36ef-4c41-ae21-dff369c41c99\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svjp6" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.169588 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/cb894fbc-36ef-4c41-ae21-dff369c41c99-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-svjp6\" (UID: \"cb894fbc-36ef-4c41-ae21-dff369c41c99\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svjp6" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.171736 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/cb894fbc-36ef-4c41-ae21-dff369c41c99-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-svjp6\" (UID: \"cb894fbc-36ef-4c41-ae21-dff369c41c99\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svjp6" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.174159 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-p4k5f" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.205993 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srjlp\" (UniqueName: \"kubernetes.io/projected/cb894fbc-36ef-4c41-ae21-dff369c41c99-kube-api-access-srjlp\") pod \"nmstate-console-plugin-864bb6dfb5-svjp6\" (UID: \"cb894fbc-36ef-4c41-ae21-dff369c41c99\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svjp6" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.269549 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f91b107-6d35-439f-a736-3b54f965d1c1-console-oauth-config\") pod \"console-7bb696995c-s9vxj\" (UID: \"8f91b107-6d35-439f-a736-3b54f965d1c1\") " pod="openshift-console/console-7bb696995c-s9vxj" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.270383 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f91b107-6d35-439f-a736-3b54f965d1c1-console-config\") pod \"console-7bb696995c-s9vxj\" (UID: \"8f91b107-6d35-439f-a736-3b54f965d1c1\") " pod="openshift-console/console-7bb696995c-s9vxj" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.270414 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f91b107-6d35-439f-a736-3b54f965d1c1-service-ca\") pod \"console-7bb696995c-s9vxj\" (UID: 
\"8f91b107-6d35-439f-a736-3b54f965d1c1\") " pod="openshift-console/console-7bb696995c-s9vxj" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.270439 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f91b107-6d35-439f-a736-3b54f965d1c1-trusted-ca-bundle\") pod \"console-7bb696995c-s9vxj\" (UID: \"8f91b107-6d35-439f-a736-3b54f965d1c1\") " pod="openshift-console/console-7bb696995c-s9vxj" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.270479 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f91b107-6d35-439f-a736-3b54f965d1c1-oauth-serving-cert\") pod \"console-7bb696995c-s9vxj\" (UID: \"8f91b107-6d35-439f-a736-3b54f965d1c1\") " pod="openshift-console/console-7bb696995c-s9vxj" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.270550 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdm29\" (UniqueName: \"kubernetes.io/projected/8f91b107-6d35-439f-a736-3b54f965d1c1-kube-api-access-gdm29\") pod \"console-7bb696995c-s9vxj\" (UID: \"8f91b107-6d35-439f-a736-3b54f965d1c1\") " pod="openshift-console/console-7bb696995c-s9vxj" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.270594 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f91b107-6d35-439f-a736-3b54f965d1c1-console-serving-cert\") pod \"console-7bb696995c-s9vxj\" (UID: \"8f91b107-6d35-439f-a736-3b54f965d1c1\") " pod="openshift-console/console-7bb696995c-s9vxj" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.273904 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svjp6" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.376171 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f91b107-6d35-439f-a736-3b54f965d1c1-console-config\") pod \"console-7bb696995c-s9vxj\" (UID: \"8f91b107-6d35-439f-a736-3b54f965d1c1\") " pod="openshift-console/console-7bb696995c-s9vxj" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.376229 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f91b107-6d35-439f-a736-3b54f965d1c1-service-ca\") pod \"console-7bb696995c-s9vxj\" (UID: \"8f91b107-6d35-439f-a736-3b54f965d1c1\") " pod="openshift-console/console-7bb696995c-s9vxj" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.376264 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f91b107-6d35-439f-a736-3b54f965d1c1-trusted-ca-bundle\") pod \"console-7bb696995c-s9vxj\" (UID: \"8f91b107-6d35-439f-a736-3b54f965d1c1\") " pod="openshift-console/console-7bb696995c-s9vxj" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.376298 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f91b107-6d35-439f-a736-3b54f965d1c1-oauth-serving-cert\") pod \"console-7bb696995c-s9vxj\" (UID: \"8f91b107-6d35-439f-a736-3b54f965d1c1\") " pod="openshift-console/console-7bb696995c-s9vxj" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.376360 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdm29\" (UniqueName: \"kubernetes.io/projected/8f91b107-6d35-439f-a736-3b54f965d1c1-kube-api-access-gdm29\") pod \"console-7bb696995c-s9vxj\" (UID: 
\"8f91b107-6d35-439f-a736-3b54f965d1c1\") " pod="openshift-console/console-7bb696995c-s9vxj" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.376403 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f91b107-6d35-439f-a736-3b54f965d1c1-console-serving-cert\") pod \"console-7bb696995c-s9vxj\" (UID: \"8f91b107-6d35-439f-a736-3b54f965d1c1\") " pod="openshift-console/console-7bb696995c-s9vxj" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.376441 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f91b107-6d35-439f-a736-3b54f965d1c1-console-oauth-config\") pod \"console-7bb696995c-s9vxj\" (UID: \"8f91b107-6d35-439f-a736-3b54f965d1c1\") " pod="openshift-console/console-7bb696995c-s9vxj" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.378043 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f91b107-6d35-439f-a736-3b54f965d1c1-trusted-ca-bundle\") pod \"console-7bb696995c-s9vxj\" (UID: \"8f91b107-6d35-439f-a736-3b54f965d1c1\") " pod="openshift-console/console-7bb696995c-s9vxj" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.378064 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f91b107-6d35-439f-a736-3b54f965d1c1-oauth-serving-cert\") pod \"console-7bb696995c-s9vxj\" (UID: \"8f91b107-6d35-439f-a736-3b54f965d1c1\") " pod="openshift-console/console-7bb696995c-s9vxj" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.378589 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f91b107-6d35-439f-a736-3b54f965d1c1-console-config\") pod \"console-7bb696995c-s9vxj\" (UID: \"8f91b107-6d35-439f-a736-3b54f965d1c1\") " 
pod="openshift-console/console-7bb696995c-s9vxj" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.378625 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f91b107-6d35-439f-a736-3b54f965d1c1-service-ca\") pod \"console-7bb696995c-s9vxj\" (UID: \"8f91b107-6d35-439f-a736-3b54f965d1c1\") " pod="openshift-console/console-7bb696995c-s9vxj" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.383403 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f91b107-6d35-439f-a736-3b54f965d1c1-console-serving-cert\") pod \"console-7bb696995c-s9vxj\" (UID: \"8f91b107-6d35-439f-a736-3b54f965d1c1\") " pod="openshift-console/console-7bb696995c-s9vxj" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.384892 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f91b107-6d35-439f-a736-3b54f965d1c1-console-oauth-config\") pod \"console-7bb696995c-s9vxj\" (UID: \"8f91b107-6d35-439f-a736-3b54f965d1c1\") " pod="openshift-console/console-7bb696995c-s9vxj" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.397130 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdm29\" (UniqueName: \"kubernetes.io/projected/8f91b107-6d35-439f-a736-3b54f965d1c1-kube-api-access-gdm29\") pod \"console-7bb696995c-s9vxj\" (UID: \"8f91b107-6d35-439f-a736-3b54f965d1c1\") " pod="openshift-console/console-7bb696995c-s9vxj" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.435463 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-xsrh7"] Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.481074 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svjp6"] Oct 01 10:27:50 crc kubenswrapper[4735]: 
W1001 10:27:50.487977 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb894fbc_36ef_4c41_ae21_dff369c41c99.slice/crio-de24bb2e79ca539b1c4b5c279b5887a93e855bdc6c07489f01f322c42d7d6c26 WatchSource:0}: Error finding container de24bb2e79ca539b1c4b5c279b5887a93e855bdc6c07489f01f322c42d7d6c26: Status 404 returned error can't find the container with id de24bb2e79ca539b1c4b5c279b5887a93e855bdc6c07489f01f322c42d7d6c26 Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.523424 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bb696995c-s9vxj" Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.598770 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-mpptn"] Oct 01 10:27:50 crc kubenswrapper[4735]: W1001 10:27:50.609984 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f89445f_bbda_4a4e_8cc5_ceb03718ffed.slice/crio-6dde75973b80a0bc8530301cc1e08f21feaa71e6087953f6bb6d710c4461c3ba WatchSource:0}: Error finding container 6dde75973b80a0bc8530301cc1e08f21feaa71e6087953f6bb6d710c4461c3ba: Status 404 returned error can't find the container with id 6dde75973b80a0bc8530301cc1e08f21feaa71e6087953f6bb6d710c4461c3ba Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.913749 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7bb696995c-s9vxj"] Oct 01 10:27:50 crc kubenswrapper[4735]: W1001 10:27:50.918764 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f91b107_6d35_439f_a736_3b54f965d1c1.slice/crio-69b16ee339a069d2ade5db051d4e71d7f409117cd0e198cccbd0ed57c63679c1 WatchSource:0}: Error finding container 69b16ee339a069d2ade5db051d4e71d7f409117cd0e198cccbd0ed57c63679c1: Status 404 
returned error can't find the container with id 69b16ee339a069d2ade5db051d4e71d7f409117cd0e198cccbd0ed57c63679c1 Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.920467 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-p4k5f" event={"ID":"b82af7f6-7fb7-4e7a-9787-1f3b84969763","Type":"ContainerStarted","Data":"f480756a5c973500ec05930f4763b9f3d46f45060c87a7d6ebc63744f769e634"} Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.921943 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-xsrh7" event={"ID":"f37e734e-18a9-4b41-b06d-1da35b2d5654","Type":"ContainerStarted","Data":"b85347aab14a25d8c5cb21240a200d62ddc7c80d82e8364a722ea478801dec63"} Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.922939 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svjp6" event={"ID":"cb894fbc-36ef-4c41-ae21-dff369c41c99","Type":"ContainerStarted","Data":"de24bb2e79ca539b1c4b5c279b5887a93e855bdc6c07489f01f322c42d7d6c26"} Oct 01 10:27:50 crc kubenswrapper[4735]: I1001 10:27:50.924040 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-mpptn" event={"ID":"6f89445f-bbda-4a4e-8cc5-ceb03718ffed","Type":"ContainerStarted","Data":"6dde75973b80a0bc8530301cc1e08f21feaa71e6087953f6bb6d710c4461c3ba"} Oct 01 10:27:51 crc kubenswrapper[4735]: I1001 10:27:51.939858 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bb696995c-s9vxj" event={"ID":"8f91b107-6d35-439f-a736-3b54f965d1c1","Type":"ContainerStarted","Data":"f8b120e152d7064e95eaa620274adca584306418a7e5def0cfc5983403f23af0"} Oct 01 10:27:51 crc kubenswrapper[4735]: I1001 10:27:51.940184 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bb696995c-s9vxj" 
event={"ID":"8f91b107-6d35-439f-a736-3b54f965d1c1","Type":"ContainerStarted","Data":"69b16ee339a069d2ade5db051d4e71d7f409117cd0e198cccbd0ed57c63679c1"} Oct 01 10:27:51 crc kubenswrapper[4735]: I1001 10:27:51.987598 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7bb696995c-s9vxj" podStartSLOduration=1.9875792140000001 podStartE2EDuration="1.987579214s" podCreationTimestamp="2025-10-01 10:27:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:27:51.985252615 +0000 UTC m=+630.678073877" watchObservedRunningTime="2025-10-01 10:27:51.987579214 +0000 UTC m=+630.680400476" Oct 01 10:27:53 crc kubenswrapper[4735]: I1001 10:27:53.951571 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svjp6" event={"ID":"cb894fbc-36ef-4c41-ae21-dff369c41c99","Type":"ContainerStarted","Data":"a0c5c3896edac191225354f20626986eb690ef348762cf8cdbcc7645cc0878c3"} Oct 01 10:27:53 crc kubenswrapper[4735]: I1001 10:27:53.953237 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-mpptn" event={"ID":"6f89445f-bbda-4a4e-8cc5-ceb03718ffed","Type":"ContainerStarted","Data":"8eddcab7b55d31b7eee2f4ad11191c016a00c0d7c02d77a7d0222401ab4759d8"} Oct 01 10:27:53 crc kubenswrapper[4735]: I1001 10:27:53.955249 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-p4k5f" event={"ID":"b82af7f6-7fb7-4e7a-9787-1f3b84969763","Type":"ContainerStarted","Data":"6c5a3a08c3d2f4c64e84fd97fb1e5008b643ee7d7ad6e1c78f9af9bd6e947aa4"} Oct 01 10:27:53 crc kubenswrapper[4735]: I1001 10:27:53.955900 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-p4k5f" Oct 01 10:27:53 crc kubenswrapper[4735]: I1001 10:27:53.957355 4735 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-xsrh7" event={"ID":"f37e734e-18a9-4b41-b06d-1da35b2d5654","Type":"ContainerStarted","Data":"0c4edc2281701f6a82da17faf3bd60abcb30d11274813058aecf1f814897e204"} Oct 01 10:27:53 crc kubenswrapper[4735]: I1001 10:27:53.957692 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6d689559c5-xsrh7" Oct 01 10:27:53 crc kubenswrapper[4735]: I1001 10:27:53.965639 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-svjp6" podStartSLOduration=2.419436083 podStartE2EDuration="4.965623491s" podCreationTimestamp="2025-10-01 10:27:49 +0000 UTC" firstStartedPulling="2025-10-01 10:27:50.49039979 +0000 UTC m=+629.183221052" lastFinishedPulling="2025-10-01 10:27:53.036587198 +0000 UTC m=+631.729408460" observedRunningTime="2025-10-01 10:27:53.964552989 +0000 UTC m=+632.657374261" watchObservedRunningTime="2025-10-01 10:27:53.965623491 +0000 UTC m=+632.658444753" Oct 01 10:27:53 crc kubenswrapper[4735]: I1001 10:27:53.984684 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6d689559c5-xsrh7" podStartSLOduration=2.393673021 podStartE2EDuration="4.984664432s" podCreationTimestamp="2025-10-01 10:27:49 +0000 UTC" firstStartedPulling="2025-10-01 10:27:50.446461283 +0000 UTC m=+629.139282545" lastFinishedPulling="2025-10-01 10:27:53.037452694 +0000 UTC m=+631.730273956" observedRunningTime="2025-10-01 10:27:53.981845649 +0000 UTC m=+632.674666921" watchObservedRunningTime="2025-10-01 10:27:53.984664432 +0000 UTC m=+632.677485714" Oct 01 10:27:54 crc kubenswrapper[4735]: I1001 10:27:54.003147 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-p4k5f" podStartSLOduration=2.172452516 podStartE2EDuration="5.003132978s" podCreationTimestamp="2025-10-01 10:27:49 +0000 UTC" 
firstStartedPulling="2025-10-01 10:27:50.229787572 +0000 UTC m=+628.922608834" lastFinishedPulling="2025-10-01 10:27:53.060468044 +0000 UTC m=+631.753289296" observedRunningTime="2025-10-01 10:27:53.999281525 +0000 UTC m=+632.692102787" watchObservedRunningTime="2025-10-01 10:27:54.003132978 +0000 UTC m=+632.695954240" Oct 01 10:27:55 crc kubenswrapper[4735]: I1001 10:27:55.974559 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-mpptn" event={"ID":"6f89445f-bbda-4a4e-8cc5-ceb03718ffed","Type":"ContainerStarted","Data":"e1b36e4cf78070e0bae14ef5e409af2ccbd73e38b354f88d97d49e95c4802d6b"} Oct 01 10:27:56 crc kubenswrapper[4735]: I1001 10:27:56.007989 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58fcddf996-mpptn" podStartSLOduration=2.338666236 podStartE2EDuration="7.007959966s" podCreationTimestamp="2025-10-01 10:27:49 +0000 UTC" firstStartedPulling="2025-10-01 10:27:50.612652662 +0000 UTC m=+629.305473924" lastFinishedPulling="2025-10-01 10:27:55.281946402 +0000 UTC m=+633.974767654" observedRunningTime="2025-10-01 10:27:56.001822215 +0000 UTC m=+634.694643527" watchObservedRunningTime="2025-10-01 10:27:56.007959966 +0000 UTC m=+634.700781258" Oct 01 10:28:00 crc kubenswrapper[4735]: I1001 10:28:00.207451 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-p4k5f" Oct 01 10:28:00 crc kubenswrapper[4735]: I1001 10:28:00.523619 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7bb696995c-s9vxj" Oct 01 10:28:00 crc kubenswrapper[4735]: I1001 10:28:00.524398 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7bb696995c-s9vxj" Oct 01 10:28:00 crc kubenswrapper[4735]: I1001 10:28:00.531205 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-console/console-7bb696995c-s9vxj" Oct 01 10:28:01 crc kubenswrapper[4735]: I1001 10:28:01.018051 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7bb696995c-s9vxj" Oct 01 10:28:01 crc kubenswrapper[4735]: I1001 10:28:01.079304 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-xtfsg"] Oct 01 10:28:10 crc kubenswrapper[4735]: I1001 10:28:10.168297 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6d689559c5-xsrh7" Oct 01 10:28:23 crc kubenswrapper[4735]: I1001 10:28:23.173630 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2"] Oct 01 10:28:23 crc kubenswrapper[4735]: I1001 10:28:23.175526 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2" Oct 01 10:28:23 crc kubenswrapper[4735]: I1001 10:28:23.177403 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 01 10:28:23 crc kubenswrapper[4735]: I1001 10:28:23.183449 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2"] Oct 01 10:28:23 crc kubenswrapper[4735]: I1001 10:28:23.311106 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e8004806-b53f-47ad-928b-1843522489ea-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2\" (UID: \"e8004806-b53f-47ad-928b-1843522489ea\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2" Oct 01 10:28:23 crc kubenswrapper[4735]: I1001 10:28:23.311225 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hpkq\" (UniqueName: \"kubernetes.io/projected/e8004806-b53f-47ad-928b-1843522489ea-kube-api-access-4hpkq\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2\" (UID: \"e8004806-b53f-47ad-928b-1843522489ea\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2" Oct 01 10:28:23 crc kubenswrapper[4735]: I1001 10:28:23.311260 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e8004806-b53f-47ad-928b-1843522489ea-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2\" (UID: \"e8004806-b53f-47ad-928b-1843522489ea\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2" Oct 01 10:28:23 crc kubenswrapper[4735]: I1001 10:28:23.412793 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hpkq\" (UniqueName: \"kubernetes.io/projected/e8004806-b53f-47ad-928b-1843522489ea-kube-api-access-4hpkq\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2\" (UID: \"e8004806-b53f-47ad-928b-1843522489ea\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2" Oct 01 10:28:23 crc kubenswrapper[4735]: I1001 10:28:23.412889 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e8004806-b53f-47ad-928b-1843522489ea-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2\" (UID: \"e8004806-b53f-47ad-928b-1843522489ea\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2" Oct 01 10:28:23 crc kubenswrapper[4735]: I1001 10:28:23.412951 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/e8004806-b53f-47ad-928b-1843522489ea-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2\" (UID: \"e8004806-b53f-47ad-928b-1843522489ea\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2" Oct 01 10:28:23 crc kubenswrapper[4735]: I1001 10:28:23.414048 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e8004806-b53f-47ad-928b-1843522489ea-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2\" (UID: \"e8004806-b53f-47ad-928b-1843522489ea\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2" Oct 01 10:28:23 crc kubenswrapper[4735]: I1001 10:28:23.414169 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e8004806-b53f-47ad-928b-1843522489ea-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2\" (UID: \"e8004806-b53f-47ad-928b-1843522489ea\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2" Oct 01 10:28:23 crc kubenswrapper[4735]: I1001 10:28:23.445899 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hpkq\" (UniqueName: \"kubernetes.io/projected/e8004806-b53f-47ad-928b-1843522489ea-kube-api-access-4hpkq\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2\" (UID: \"e8004806-b53f-47ad-928b-1843522489ea\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2" Oct 01 10:28:23 crc kubenswrapper[4735]: I1001 10:28:23.493005 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2" Oct 01 10:28:23 crc kubenswrapper[4735]: I1001 10:28:23.933103 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2"] Oct 01 10:28:23 crc kubenswrapper[4735]: W1001 10:28:23.940770 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8004806_b53f_47ad_928b_1843522489ea.slice/crio-1e472b113fc3fe99f09d0a9a2cf21c7a966363603472deee8f1954a38b230d5b WatchSource:0}: Error finding container 1e472b113fc3fe99f09d0a9a2cf21c7a966363603472deee8f1954a38b230d5b: Status 404 returned error can't find the container with id 1e472b113fc3fe99f09d0a9a2cf21c7a966363603472deee8f1954a38b230d5b Oct 01 10:28:24 crc kubenswrapper[4735]: I1001 10:28:24.162201 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2" event={"ID":"e8004806-b53f-47ad-928b-1843522489ea","Type":"ContainerStarted","Data":"04ab09784458a09bb7cf43c080a09f62d02b944bac8a7f6c294f20bf6b879a03"} Oct 01 10:28:24 crc kubenswrapper[4735]: I1001 10:28:24.162281 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2" event={"ID":"e8004806-b53f-47ad-928b-1843522489ea","Type":"ContainerStarted","Data":"1e472b113fc3fe99f09d0a9a2cf21c7a966363603472deee8f1954a38b230d5b"} Oct 01 10:28:25 crc kubenswrapper[4735]: I1001 10:28:25.169401 4735 generic.go:334] "Generic (PLEG): container finished" podID="e8004806-b53f-47ad-928b-1843522489ea" containerID="04ab09784458a09bb7cf43c080a09f62d02b944bac8a7f6c294f20bf6b879a03" exitCode=0 Oct 01 10:28:25 crc kubenswrapper[4735]: I1001 10:28:25.169525 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2" event={"ID":"e8004806-b53f-47ad-928b-1843522489ea","Type":"ContainerDied","Data":"04ab09784458a09bb7cf43c080a09f62d02b944bac8a7f6c294f20bf6b879a03"} Oct 01 10:28:26 crc kubenswrapper[4735]: I1001 10:28:26.140998 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-xtfsg" podUID="5e28238a-9bf2-4e10-827d-7350e0ec0150" containerName="console" containerID="cri-o://ec4183f8ea6399bd1c4a90b48a1fbe5d44ce980275e436752305478a24a805d8" gracePeriod=15 Oct 01 10:28:26 crc kubenswrapper[4735]: I1001 10:28:26.177593 4735 generic.go:334] "Generic (PLEG): container finished" podID="e8004806-b53f-47ad-928b-1843522489ea" containerID="6515ded5234ac15d0a33859455e0e7686a53738cbda7482ee67e2c95bbf756be" exitCode=0 Oct 01 10:28:26 crc kubenswrapper[4735]: I1001 10:28:26.177662 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2" event={"ID":"e8004806-b53f-47ad-928b-1843522489ea","Type":"ContainerDied","Data":"6515ded5234ac15d0a33859455e0e7686a53738cbda7482ee67e2c95bbf756be"} Oct 01 10:28:26 crc kubenswrapper[4735]: I1001 10:28:26.562257 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-xtfsg_5e28238a-9bf2-4e10-827d-7350e0ec0150/console/0.log" Oct 01 10:28:26 crc kubenswrapper[4735]: I1001 10:28:26.562343 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-xtfsg" Oct 01 10:28:26 crc kubenswrapper[4735]: I1001 10:28:26.664090 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5e28238a-9bf2-4e10-827d-7350e0ec0150-console-config\") pod \"5e28238a-9bf2-4e10-827d-7350e0ec0150\" (UID: \"5e28238a-9bf2-4e10-827d-7350e0ec0150\") " Oct 01 10:28:26 crc kubenswrapper[4735]: I1001 10:28:26.664179 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5e28238a-9bf2-4e10-827d-7350e0ec0150-oauth-serving-cert\") pod \"5e28238a-9bf2-4e10-827d-7350e0ec0150\" (UID: \"5e28238a-9bf2-4e10-827d-7350e0ec0150\") " Oct 01 10:28:26 crc kubenswrapper[4735]: I1001 10:28:26.664239 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5e28238a-9bf2-4e10-827d-7350e0ec0150-service-ca\") pod \"5e28238a-9bf2-4e10-827d-7350e0ec0150\" (UID: \"5e28238a-9bf2-4e10-827d-7350e0ec0150\") " Oct 01 10:28:26 crc kubenswrapper[4735]: I1001 10:28:26.664300 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e28238a-9bf2-4e10-827d-7350e0ec0150-console-serving-cert\") pod \"5e28238a-9bf2-4e10-827d-7350e0ec0150\" (UID: \"5e28238a-9bf2-4e10-827d-7350e0ec0150\") " Oct 01 10:28:26 crc kubenswrapper[4735]: I1001 10:28:26.664392 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tnzr\" (UniqueName: \"kubernetes.io/projected/5e28238a-9bf2-4e10-827d-7350e0ec0150-kube-api-access-6tnzr\") pod \"5e28238a-9bf2-4e10-827d-7350e0ec0150\" (UID: \"5e28238a-9bf2-4e10-827d-7350e0ec0150\") " Oct 01 10:28:26 crc kubenswrapper[4735]: I1001 10:28:26.664446 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5e28238a-9bf2-4e10-827d-7350e0ec0150-console-oauth-config\") pod \"5e28238a-9bf2-4e10-827d-7350e0ec0150\" (UID: \"5e28238a-9bf2-4e10-827d-7350e0ec0150\") " Oct 01 10:28:26 crc kubenswrapper[4735]: I1001 10:28:26.664525 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e28238a-9bf2-4e10-827d-7350e0ec0150-trusted-ca-bundle\") pod \"5e28238a-9bf2-4e10-827d-7350e0ec0150\" (UID: \"5e28238a-9bf2-4e10-827d-7350e0ec0150\") " Oct 01 10:28:26 crc kubenswrapper[4735]: I1001 10:28:26.665171 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e28238a-9bf2-4e10-827d-7350e0ec0150-service-ca" (OuterVolumeSpecName: "service-ca") pod "5e28238a-9bf2-4e10-827d-7350e0ec0150" (UID: "5e28238a-9bf2-4e10-827d-7350e0ec0150"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:28:26 crc kubenswrapper[4735]: I1001 10:28:26.665194 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e28238a-9bf2-4e10-827d-7350e0ec0150-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5e28238a-9bf2-4e10-827d-7350e0ec0150" (UID: "5e28238a-9bf2-4e10-827d-7350e0ec0150"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:28:26 crc kubenswrapper[4735]: I1001 10:28:26.665220 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e28238a-9bf2-4e10-827d-7350e0ec0150-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5e28238a-9bf2-4e10-827d-7350e0ec0150" (UID: "5e28238a-9bf2-4e10-827d-7350e0ec0150"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:28:26 crc kubenswrapper[4735]: I1001 10:28:26.665285 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e28238a-9bf2-4e10-827d-7350e0ec0150-console-config" (OuterVolumeSpecName: "console-config") pod "5e28238a-9bf2-4e10-827d-7350e0ec0150" (UID: "5e28238a-9bf2-4e10-827d-7350e0ec0150"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:28:26 crc kubenswrapper[4735]: I1001 10:28:26.671014 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e28238a-9bf2-4e10-827d-7350e0ec0150-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5e28238a-9bf2-4e10-827d-7350e0ec0150" (UID: "5e28238a-9bf2-4e10-827d-7350e0ec0150"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:28:26 crc kubenswrapper[4735]: I1001 10:28:26.671790 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e28238a-9bf2-4e10-827d-7350e0ec0150-kube-api-access-6tnzr" (OuterVolumeSpecName: "kube-api-access-6tnzr") pod "5e28238a-9bf2-4e10-827d-7350e0ec0150" (UID: "5e28238a-9bf2-4e10-827d-7350e0ec0150"). InnerVolumeSpecName "kube-api-access-6tnzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:28:26 crc kubenswrapper[4735]: I1001 10:28:26.672678 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e28238a-9bf2-4e10-827d-7350e0ec0150-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5e28238a-9bf2-4e10-827d-7350e0ec0150" (UID: "5e28238a-9bf2-4e10-827d-7350e0ec0150"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:28:26 crc kubenswrapper[4735]: I1001 10:28:26.765935 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e28238a-9bf2-4e10-827d-7350e0ec0150-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:28:26 crc kubenswrapper[4735]: I1001 10:28:26.766280 4735 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5e28238a-9bf2-4e10-827d-7350e0ec0150-console-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:28:26 crc kubenswrapper[4735]: I1001 10:28:26.766295 4735 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5e28238a-9bf2-4e10-827d-7350e0ec0150-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 10:28:26 crc kubenswrapper[4735]: I1001 10:28:26.766307 4735 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5e28238a-9bf2-4e10-827d-7350e0ec0150-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 10:28:26 crc kubenswrapper[4735]: I1001 10:28:26.766318 4735 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5e28238a-9bf2-4e10-827d-7350e0ec0150-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 10:28:26 crc kubenswrapper[4735]: I1001 10:28:26.766332 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tnzr\" (UniqueName: \"kubernetes.io/projected/5e28238a-9bf2-4e10-827d-7350e0ec0150-kube-api-access-6tnzr\") on node \"crc\" DevicePath \"\"" Oct 01 10:28:26 crc kubenswrapper[4735]: I1001 10:28:26.766345 4735 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5e28238a-9bf2-4e10-827d-7350e0ec0150-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:28:27 crc 
kubenswrapper[4735]: I1001 10:28:27.185587 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-xtfsg_5e28238a-9bf2-4e10-827d-7350e0ec0150/console/0.log" Oct 01 10:28:27 crc kubenswrapper[4735]: I1001 10:28:27.185643 4735 generic.go:334] "Generic (PLEG): container finished" podID="5e28238a-9bf2-4e10-827d-7350e0ec0150" containerID="ec4183f8ea6399bd1c4a90b48a1fbe5d44ce980275e436752305478a24a805d8" exitCode=2 Oct 01 10:28:27 crc kubenswrapper[4735]: I1001 10:28:27.185716 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-xtfsg" event={"ID":"5e28238a-9bf2-4e10-827d-7350e0ec0150","Type":"ContainerDied","Data":"ec4183f8ea6399bd1c4a90b48a1fbe5d44ce980275e436752305478a24a805d8"} Oct 01 10:28:27 crc kubenswrapper[4735]: I1001 10:28:27.185748 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-xtfsg" Oct 01 10:28:27 crc kubenswrapper[4735]: I1001 10:28:27.185771 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-xtfsg" event={"ID":"5e28238a-9bf2-4e10-827d-7350e0ec0150","Type":"ContainerDied","Data":"d575abb946245bd204701aa04c8d83889b916e25ea86e731d250a799a4af2dd9"} Oct 01 10:28:27 crc kubenswrapper[4735]: I1001 10:28:27.185793 4735 scope.go:117] "RemoveContainer" containerID="ec4183f8ea6399bd1c4a90b48a1fbe5d44ce980275e436752305478a24a805d8" Oct 01 10:28:27 crc kubenswrapper[4735]: I1001 10:28:27.188925 4735 generic.go:334] "Generic (PLEG): container finished" podID="e8004806-b53f-47ad-928b-1843522489ea" containerID="e053a872c1eede5d11d17b83da415e9e1035514fce27656bb0aa875a5dcc64d8" exitCode=0 Oct 01 10:28:27 crc kubenswrapper[4735]: I1001 10:28:27.188961 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2" 
event={"ID":"e8004806-b53f-47ad-928b-1843522489ea","Type":"ContainerDied","Data":"e053a872c1eede5d11d17b83da415e9e1035514fce27656bb0aa875a5dcc64d8"} Oct 01 10:28:27 crc kubenswrapper[4735]: I1001 10:28:27.201810 4735 scope.go:117] "RemoveContainer" containerID="ec4183f8ea6399bd1c4a90b48a1fbe5d44ce980275e436752305478a24a805d8" Oct 01 10:28:27 crc kubenswrapper[4735]: E1001 10:28:27.202556 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec4183f8ea6399bd1c4a90b48a1fbe5d44ce980275e436752305478a24a805d8\": container with ID starting with ec4183f8ea6399bd1c4a90b48a1fbe5d44ce980275e436752305478a24a805d8 not found: ID does not exist" containerID="ec4183f8ea6399bd1c4a90b48a1fbe5d44ce980275e436752305478a24a805d8" Oct 01 10:28:27 crc kubenswrapper[4735]: I1001 10:28:27.202613 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec4183f8ea6399bd1c4a90b48a1fbe5d44ce980275e436752305478a24a805d8"} err="failed to get container status \"ec4183f8ea6399bd1c4a90b48a1fbe5d44ce980275e436752305478a24a805d8\": rpc error: code = NotFound desc = could not find container \"ec4183f8ea6399bd1c4a90b48a1fbe5d44ce980275e436752305478a24a805d8\": container with ID starting with ec4183f8ea6399bd1c4a90b48a1fbe5d44ce980275e436752305478a24a805d8 not found: ID does not exist" Oct 01 10:28:27 crc kubenswrapper[4735]: I1001 10:28:27.223056 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-xtfsg"] Oct 01 10:28:27 crc kubenswrapper[4735]: I1001 10:28:27.226525 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-xtfsg"] Oct 01 10:28:27 crc kubenswrapper[4735]: I1001 10:28:27.903991 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e28238a-9bf2-4e10-827d-7350e0ec0150" path="/var/lib/kubelet/pods/5e28238a-9bf2-4e10-827d-7350e0ec0150/volumes" Oct 01 10:28:28 crc 
kubenswrapper[4735]: I1001 10:28:28.479407 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2" Oct 01 10:28:28 crc kubenswrapper[4735]: I1001 10:28:28.591706 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e8004806-b53f-47ad-928b-1843522489ea-bundle\") pod \"e8004806-b53f-47ad-928b-1843522489ea\" (UID: \"e8004806-b53f-47ad-928b-1843522489ea\") " Oct 01 10:28:28 crc kubenswrapper[4735]: I1001 10:28:28.591782 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e8004806-b53f-47ad-928b-1843522489ea-util\") pod \"e8004806-b53f-47ad-928b-1843522489ea\" (UID: \"e8004806-b53f-47ad-928b-1843522489ea\") " Oct 01 10:28:28 crc kubenswrapper[4735]: I1001 10:28:28.591814 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hpkq\" (UniqueName: \"kubernetes.io/projected/e8004806-b53f-47ad-928b-1843522489ea-kube-api-access-4hpkq\") pod \"e8004806-b53f-47ad-928b-1843522489ea\" (UID: \"e8004806-b53f-47ad-928b-1843522489ea\") " Oct 01 10:28:28 crc kubenswrapper[4735]: I1001 10:28:28.592878 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8004806-b53f-47ad-928b-1843522489ea-bundle" (OuterVolumeSpecName: "bundle") pod "e8004806-b53f-47ad-928b-1843522489ea" (UID: "e8004806-b53f-47ad-928b-1843522489ea"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:28:28 crc kubenswrapper[4735]: I1001 10:28:28.599708 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8004806-b53f-47ad-928b-1843522489ea-kube-api-access-4hpkq" (OuterVolumeSpecName: "kube-api-access-4hpkq") pod "e8004806-b53f-47ad-928b-1843522489ea" (UID: "e8004806-b53f-47ad-928b-1843522489ea"). InnerVolumeSpecName "kube-api-access-4hpkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:28:28 crc kubenswrapper[4735]: I1001 10:28:28.609676 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8004806-b53f-47ad-928b-1843522489ea-util" (OuterVolumeSpecName: "util") pod "e8004806-b53f-47ad-928b-1843522489ea" (UID: "e8004806-b53f-47ad-928b-1843522489ea"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:28:28 crc kubenswrapper[4735]: I1001 10:28:28.693576 4735 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e8004806-b53f-47ad-928b-1843522489ea-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:28:28 crc kubenswrapper[4735]: I1001 10:28:28.693615 4735 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e8004806-b53f-47ad-928b-1843522489ea-util\") on node \"crc\" DevicePath \"\"" Oct 01 10:28:28 crc kubenswrapper[4735]: I1001 10:28:28.693627 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hpkq\" (UniqueName: \"kubernetes.io/projected/e8004806-b53f-47ad-928b-1843522489ea-kube-api-access-4hpkq\") on node \"crc\" DevicePath \"\"" Oct 01 10:28:29 crc kubenswrapper[4735]: I1001 10:28:29.210593 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2" 
event={"ID":"e8004806-b53f-47ad-928b-1843522489ea","Type":"ContainerDied","Data":"1e472b113fc3fe99f09d0a9a2cf21c7a966363603472deee8f1954a38b230d5b"} Oct 01 10:28:29 crc kubenswrapper[4735]: I1001 10:28:29.210659 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e472b113fc3fe99f09d0a9a2cf21c7a966363603472deee8f1954a38b230d5b" Oct 01 10:28:29 crc kubenswrapper[4735]: I1001 10:28:29.210681 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.096840 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-755f8bc9ff-4w5jq"] Oct 01 10:28:38 crc kubenswrapper[4735]: E1001 10:28:38.097675 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e28238a-9bf2-4e10-827d-7350e0ec0150" containerName="console" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.097690 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e28238a-9bf2-4e10-827d-7350e0ec0150" containerName="console" Oct 01 10:28:38 crc kubenswrapper[4735]: E1001 10:28:38.097705 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8004806-b53f-47ad-928b-1843522489ea" containerName="util" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.097711 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8004806-b53f-47ad-928b-1843522489ea" containerName="util" Oct 01 10:28:38 crc kubenswrapper[4735]: E1001 10:28:38.097728 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8004806-b53f-47ad-928b-1843522489ea" containerName="pull" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.097736 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8004806-b53f-47ad-928b-1843522489ea" containerName="pull" Oct 01 10:28:38 crc kubenswrapper[4735]: E1001 10:28:38.097751 4735 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8004806-b53f-47ad-928b-1843522489ea" containerName="extract" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.097757 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8004806-b53f-47ad-928b-1843522489ea" containerName="extract" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.097897 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e28238a-9bf2-4e10-827d-7350e0ec0150" containerName="console" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.097912 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8004806-b53f-47ad-928b-1843522489ea" containerName="extract" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.098399 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-755f8bc9ff-4w5jq" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.106964 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.107093 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.107198 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.107946 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-bdvpt" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.114929 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.129138 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["metallb-system/metallb-operator-controller-manager-755f8bc9ff-4w5jq"] Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.219397 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a1a673c2-160f-4f1a-8bcf-fbc1e3692cb9-webhook-cert\") pod \"metallb-operator-controller-manager-755f8bc9ff-4w5jq\" (UID: \"a1a673c2-160f-4f1a-8bcf-fbc1e3692cb9\") " pod="metallb-system/metallb-operator-controller-manager-755f8bc9ff-4w5jq" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.219586 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nqlv\" (UniqueName: \"kubernetes.io/projected/a1a673c2-160f-4f1a-8bcf-fbc1e3692cb9-kube-api-access-6nqlv\") pod \"metallb-operator-controller-manager-755f8bc9ff-4w5jq\" (UID: \"a1a673c2-160f-4f1a-8bcf-fbc1e3692cb9\") " pod="metallb-system/metallb-operator-controller-manager-755f8bc9ff-4w5jq" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.219627 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a1a673c2-160f-4f1a-8bcf-fbc1e3692cb9-apiservice-cert\") pod \"metallb-operator-controller-manager-755f8bc9ff-4w5jq\" (UID: \"a1a673c2-160f-4f1a-8bcf-fbc1e3692cb9\") " pod="metallb-system/metallb-operator-controller-manager-755f8bc9ff-4w5jq" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.320877 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nqlv\" (UniqueName: \"kubernetes.io/projected/a1a673c2-160f-4f1a-8bcf-fbc1e3692cb9-kube-api-access-6nqlv\") pod \"metallb-operator-controller-manager-755f8bc9ff-4w5jq\" (UID: \"a1a673c2-160f-4f1a-8bcf-fbc1e3692cb9\") " pod="metallb-system/metallb-operator-controller-manager-755f8bc9ff-4w5jq" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.320931 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a1a673c2-160f-4f1a-8bcf-fbc1e3692cb9-apiservice-cert\") pod \"metallb-operator-controller-manager-755f8bc9ff-4w5jq\" (UID: \"a1a673c2-160f-4f1a-8bcf-fbc1e3692cb9\") " pod="metallb-system/metallb-operator-controller-manager-755f8bc9ff-4w5jq" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.320983 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a1a673c2-160f-4f1a-8bcf-fbc1e3692cb9-webhook-cert\") pod \"metallb-operator-controller-manager-755f8bc9ff-4w5jq\" (UID: \"a1a673c2-160f-4f1a-8bcf-fbc1e3692cb9\") " pod="metallb-system/metallb-operator-controller-manager-755f8bc9ff-4w5jq" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.326334 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a1a673c2-160f-4f1a-8bcf-fbc1e3692cb9-apiservice-cert\") pod \"metallb-operator-controller-manager-755f8bc9ff-4w5jq\" (UID: \"a1a673c2-160f-4f1a-8bcf-fbc1e3692cb9\") " pod="metallb-system/metallb-operator-controller-manager-755f8bc9ff-4w5jq" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.326961 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a1a673c2-160f-4f1a-8bcf-fbc1e3692cb9-webhook-cert\") pod \"metallb-operator-controller-manager-755f8bc9ff-4w5jq\" (UID: \"a1a673c2-160f-4f1a-8bcf-fbc1e3692cb9\") " pod="metallb-system/metallb-operator-controller-manager-755f8bc9ff-4w5jq" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.336621 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nqlv\" (UniqueName: \"kubernetes.io/projected/a1a673c2-160f-4f1a-8bcf-fbc1e3692cb9-kube-api-access-6nqlv\") pod \"metallb-operator-controller-manager-755f8bc9ff-4w5jq\" (UID: 
\"a1a673c2-160f-4f1a-8bcf-fbc1e3692cb9\") " pod="metallb-system/metallb-operator-controller-manager-755f8bc9ff-4w5jq" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.417545 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5455ff795f-xpx6b"] Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.418248 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5455ff795f-xpx6b" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.419878 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.420989 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.422027 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-755f8bc9ff-4w5jq" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.426217 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-56zr2" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.437576 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5455ff795f-xpx6b"] Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.525311 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/db9781ea-0490-415d-8e5b-7b64d4aa62dd-webhook-cert\") pod \"metallb-operator-webhook-server-5455ff795f-xpx6b\" (UID: \"db9781ea-0490-415d-8e5b-7b64d4aa62dd\") " pod="metallb-system/metallb-operator-webhook-server-5455ff795f-xpx6b" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.525763 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/db9781ea-0490-415d-8e5b-7b64d4aa62dd-apiservice-cert\") pod \"metallb-operator-webhook-server-5455ff795f-xpx6b\" (UID: \"db9781ea-0490-415d-8e5b-7b64d4aa62dd\") " pod="metallb-system/metallb-operator-webhook-server-5455ff795f-xpx6b" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.525833 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz4r5\" (UniqueName: \"kubernetes.io/projected/db9781ea-0490-415d-8e5b-7b64d4aa62dd-kube-api-access-hz4r5\") pod \"metallb-operator-webhook-server-5455ff795f-xpx6b\" (UID: \"db9781ea-0490-415d-8e5b-7b64d4aa62dd\") " pod="metallb-system/metallb-operator-webhook-server-5455ff795f-xpx6b" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.626592 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz4r5\" (UniqueName: \"kubernetes.io/projected/db9781ea-0490-415d-8e5b-7b64d4aa62dd-kube-api-access-hz4r5\") pod \"metallb-operator-webhook-server-5455ff795f-xpx6b\" (UID: \"db9781ea-0490-415d-8e5b-7b64d4aa62dd\") " pod="metallb-system/metallb-operator-webhook-server-5455ff795f-xpx6b" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.626688 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/db9781ea-0490-415d-8e5b-7b64d4aa62dd-webhook-cert\") pod \"metallb-operator-webhook-server-5455ff795f-xpx6b\" (UID: \"db9781ea-0490-415d-8e5b-7b64d4aa62dd\") " pod="metallb-system/metallb-operator-webhook-server-5455ff795f-xpx6b" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.626719 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/db9781ea-0490-415d-8e5b-7b64d4aa62dd-apiservice-cert\") pod 
\"metallb-operator-webhook-server-5455ff795f-xpx6b\" (UID: \"db9781ea-0490-415d-8e5b-7b64d4aa62dd\") " pod="metallb-system/metallb-operator-webhook-server-5455ff795f-xpx6b" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.630889 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/db9781ea-0490-415d-8e5b-7b64d4aa62dd-apiservice-cert\") pod \"metallb-operator-webhook-server-5455ff795f-xpx6b\" (UID: \"db9781ea-0490-415d-8e5b-7b64d4aa62dd\") " pod="metallb-system/metallb-operator-webhook-server-5455ff795f-xpx6b" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.640132 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/db9781ea-0490-415d-8e5b-7b64d4aa62dd-webhook-cert\") pod \"metallb-operator-webhook-server-5455ff795f-xpx6b\" (UID: \"db9781ea-0490-415d-8e5b-7b64d4aa62dd\") " pod="metallb-system/metallb-operator-webhook-server-5455ff795f-xpx6b" Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.641004 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-755f8bc9ff-4w5jq"] Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.642923 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz4r5\" (UniqueName: \"kubernetes.io/projected/db9781ea-0490-415d-8e5b-7b64d4aa62dd-kube-api-access-hz4r5\") pod \"metallb-operator-webhook-server-5455ff795f-xpx6b\" (UID: \"db9781ea-0490-415d-8e5b-7b64d4aa62dd\") " pod="metallb-system/metallb-operator-webhook-server-5455ff795f-xpx6b" Oct 01 10:28:38 crc kubenswrapper[4735]: W1001 10:28:38.645253 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1a673c2_160f_4f1a_8bcf_fbc1e3692cb9.slice/crio-a85cf723a3e0f1d0a12531660dcb47f30f128464c67c1d2b3a557bac90c5e0b4 WatchSource:0}: Error finding container 
a85cf723a3e0f1d0a12531660dcb47f30f128464c67c1d2b3a557bac90c5e0b4: Status 404 returned error can't find the container with id a85cf723a3e0f1d0a12531660dcb47f30f128464c67c1d2b3a557bac90c5e0b4 Oct 01 10:28:38 crc kubenswrapper[4735]: I1001 10:28:38.732200 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5455ff795f-xpx6b" Oct 01 10:28:39 crc kubenswrapper[4735]: I1001 10:28:39.202329 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5455ff795f-xpx6b"] Oct 01 10:28:39 crc kubenswrapper[4735]: W1001 10:28:39.206883 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb9781ea_0490_415d_8e5b_7b64d4aa62dd.slice/crio-6cbe02384662ce7034958590d77c7df3c197593ab666a4cbe475468dd024d17f WatchSource:0}: Error finding container 6cbe02384662ce7034958590d77c7df3c197593ab666a4cbe475468dd024d17f: Status 404 returned error can't find the container with id 6cbe02384662ce7034958590d77c7df3c197593ab666a4cbe475468dd024d17f Oct 01 10:28:39 crc kubenswrapper[4735]: I1001 10:28:39.272461 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-755f8bc9ff-4w5jq" event={"ID":"a1a673c2-160f-4f1a-8bcf-fbc1e3692cb9","Type":"ContainerStarted","Data":"a85cf723a3e0f1d0a12531660dcb47f30f128464c67c1d2b3a557bac90c5e0b4"} Oct 01 10:28:39 crc kubenswrapper[4735]: I1001 10:28:39.273370 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5455ff795f-xpx6b" event={"ID":"db9781ea-0490-415d-8e5b-7b64d4aa62dd","Type":"ContainerStarted","Data":"6cbe02384662ce7034958590d77c7df3c197593ab666a4cbe475468dd024d17f"} Oct 01 10:28:44 crc kubenswrapper[4735]: I1001 10:28:44.303359 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/metallb-operator-webhook-server-5455ff795f-xpx6b" event={"ID":"db9781ea-0490-415d-8e5b-7b64d4aa62dd","Type":"ContainerStarted","Data":"da4a52617fe8724f3e43755d3ca071529f712cac2c4589695a79de3849872b11"} Oct 01 10:28:44 crc kubenswrapper[4735]: I1001 10:28:44.304034 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5455ff795f-xpx6b" Oct 01 10:28:44 crc kubenswrapper[4735]: I1001 10:28:44.306138 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-755f8bc9ff-4w5jq" event={"ID":"a1a673c2-160f-4f1a-8bcf-fbc1e3692cb9","Type":"ContainerStarted","Data":"00cafa1bd390d08171d47129ee78511cb989403ebb24704e718d82a0257236f8"} Oct 01 10:28:44 crc kubenswrapper[4735]: I1001 10:28:44.306632 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-755f8bc9ff-4w5jq" Oct 01 10:28:44 crc kubenswrapper[4735]: I1001 10:28:44.328506 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5455ff795f-xpx6b" podStartSLOduration=2.320753927 podStartE2EDuration="6.328482111s" podCreationTimestamp="2025-10-01 10:28:38 +0000 UTC" firstStartedPulling="2025-10-01 10:28:39.209798166 +0000 UTC m=+677.902619428" lastFinishedPulling="2025-10-01 10:28:43.21752635 +0000 UTC m=+681.910347612" observedRunningTime="2025-10-01 10:28:44.325168604 +0000 UTC m=+683.017989866" watchObservedRunningTime="2025-10-01 10:28:44.328482111 +0000 UTC m=+683.021303373" Oct 01 10:28:44 crc kubenswrapper[4735]: I1001 10:28:44.356280 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-755f8bc9ff-4w5jq" podStartSLOduration=1.801522051 podStartE2EDuration="6.356258452s" podCreationTimestamp="2025-10-01 10:28:38 +0000 UTC" firstStartedPulling="2025-10-01 
10:28:38.646932231 +0000 UTC m=+677.339753493" lastFinishedPulling="2025-10-01 10:28:43.201668632 +0000 UTC m=+681.894489894" observedRunningTime="2025-10-01 10:28:44.352324728 +0000 UTC m=+683.045146010" watchObservedRunningTime="2025-10-01 10:28:44.356258452 +0000 UTC m=+683.049079714" Oct 01 10:28:58 crc kubenswrapper[4735]: I1001 10:28:58.737868 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5455ff795f-xpx6b" Oct 01 10:29:18 crc kubenswrapper[4735]: I1001 10:29:18.426389 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-755f8bc9ff-4w5jq" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.293720 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-w6xhg"] Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.296646 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-w6xhg" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.300001 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.300094 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-f2kj9" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.302475 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.311301 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-cgg5l"] Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.312533 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-cgg5l" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.314605 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.324735 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-cgg5l"] Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.385820 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chdtc\" (UniqueName: \"kubernetes.io/projected/cfedcee2-b1bb-4709-af40-1d2c309be304-kube-api-access-chdtc\") pod \"frr-k8s-w6xhg\" (UID: \"cfedcee2-b1bb-4709-af40-1d2c309be304\") " pod="metallb-system/frr-k8s-w6xhg" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.385871 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/cfedcee2-b1bb-4709-af40-1d2c309be304-frr-sockets\") pod \"frr-k8s-w6xhg\" (UID: \"cfedcee2-b1bb-4709-af40-1d2c309be304\") " pod="metallb-system/frr-k8s-w6xhg" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.385900 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cfedcee2-b1bb-4709-af40-1d2c309be304-metrics-certs\") pod \"frr-k8s-w6xhg\" (UID: \"cfedcee2-b1bb-4709-af40-1d2c309be304\") " pod="metallb-system/frr-k8s-w6xhg" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.385925 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/cfedcee2-b1bb-4709-af40-1d2c309be304-frr-startup\") pod \"frr-k8s-w6xhg\" (UID: \"cfedcee2-b1bb-4709-af40-1d2c309be304\") " pod="metallb-system/frr-k8s-w6xhg" Oct 01 10:29:19 crc 
kubenswrapper[4735]: I1001 10:29:19.385954 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/cfedcee2-b1bb-4709-af40-1d2c309be304-reloader\") pod \"frr-k8s-w6xhg\" (UID: \"cfedcee2-b1bb-4709-af40-1d2c309be304\") " pod="metallb-system/frr-k8s-w6xhg" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.386078 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/cfedcee2-b1bb-4709-af40-1d2c309be304-metrics\") pod \"frr-k8s-w6xhg\" (UID: \"cfedcee2-b1bb-4709-af40-1d2c309be304\") " pod="metallb-system/frr-k8s-w6xhg" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.386221 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/cfedcee2-b1bb-4709-af40-1d2c309be304-frr-conf\") pod \"frr-k8s-w6xhg\" (UID: \"cfedcee2-b1bb-4709-af40-1d2c309be304\") " pod="metallb-system/frr-k8s-w6xhg" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.395191 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-9c82t"] Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.396113 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-9c82t" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.400690 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.403729 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.403838 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.404532 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-sc4dv" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.409019 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5d688f5ffc-6nnhj"] Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.410030 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5d688f5ffc-6nnhj" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.411855 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.415544 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-6nnhj"] Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.487423 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/cfedcee2-b1bb-4709-af40-1d2c309be304-frr-startup\") pod \"frr-k8s-w6xhg\" (UID: \"cfedcee2-b1bb-4709-af40-1d2c309be304\") " pod="metallb-system/frr-k8s-w6xhg" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.487481 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6fg2\" (UniqueName: \"kubernetes.io/projected/d45651e4-469d-458c-9d48-ad996f82c3f0-kube-api-access-h6fg2\") pod \"frr-k8s-webhook-server-5478bdb765-cgg5l\" (UID: \"d45651e4-469d-458c-9d48-ad996f82c3f0\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-cgg5l" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.487522 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/cfedcee2-b1bb-4709-af40-1d2c309be304-reloader\") pod \"frr-k8s-w6xhg\" (UID: \"cfedcee2-b1bb-4709-af40-1d2c309be304\") " pod="metallb-system/frr-k8s-w6xhg" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.487571 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d45651e4-469d-458c-9d48-ad996f82c3f0-cert\") pod \"frr-k8s-webhook-server-5478bdb765-cgg5l\" (UID: \"d45651e4-469d-458c-9d48-ad996f82c3f0\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-cgg5l" Oct 01 
10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.487592 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/cfedcee2-b1bb-4709-af40-1d2c309be304-metrics\") pod \"frr-k8s-w6xhg\" (UID: \"cfedcee2-b1bb-4709-af40-1d2c309be304\") " pod="metallb-system/frr-k8s-w6xhg" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.487614 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d76a5dcc-a5c5-435c-9dfc-11bab4a422e9-cert\") pod \"controller-5d688f5ffc-6nnhj\" (UID: \"d76a5dcc-a5c5-435c-9dfc-11bab4a422e9\") " pod="metallb-system/controller-5d688f5ffc-6nnhj" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.487821 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmqg2\" (UniqueName: \"kubernetes.io/projected/ea017fc5-1856-47da-99a5-c866738be35e-kube-api-access-qmqg2\") pod \"speaker-9c82t\" (UID: \"ea017fc5-1856-47da-99a5-c866738be35e\") " pod="metallb-system/speaker-9c82t" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.487913 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea017fc5-1856-47da-99a5-c866738be35e-metrics-certs\") pod \"speaker-9c82t\" (UID: \"ea017fc5-1856-47da-99a5-c866738be35e\") " pod="metallb-system/speaker-9c82t" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.487977 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/cfedcee2-b1bb-4709-af40-1d2c309be304-frr-conf\") pod \"frr-k8s-w6xhg\" (UID: \"cfedcee2-b1bb-4709-af40-1d2c309be304\") " pod="metallb-system/frr-k8s-w6xhg" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.488011 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-chdtc\" (UniqueName: \"kubernetes.io/projected/cfedcee2-b1bb-4709-af40-1d2c309be304-kube-api-access-chdtc\") pod \"frr-k8s-w6xhg\" (UID: \"cfedcee2-b1bb-4709-af40-1d2c309be304\") " pod="metallb-system/frr-k8s-w6xhg" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.488029 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ea017fc5-1856-47da-99a5-c866738be35e-memberlist\") pod \"speaker-9c82t\" (UID: \"ea017fc5-1856-47da-99a5-c866738be35e\") " pod="metallb-system/speaker-9c82t" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.488055 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/cfedcee2-b1bb-4709-af40-1d2c309be304-frr-sockets\") pod \"frr-k8s-w6xhg\" (UID: \"cfedcee2-b1bb-4709-af40-1d2c309be304\") " pod="metallb-system/frr-k8s-w6xhg" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.488093 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ea017fc5-1856-47da-99a5-c866738be35e-metallb-excludel2\") pod \"speaker-9c82t\" (UID: \"ea017fc5-1856-47da-99a5-c866738be35e\") " pod="metallb-system/speaker-9c82t" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.488109 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d76a5dcc-a5c5-435c-9dfc-11bab4a422e9-metrics-certs\") pod \"controller-5d688f5ffc-6nnhj\" (UID: \"d76a5dcc-a5c5-435c-9dfc-11bab4a422e9\") " pod="metallb-system/controller-5d688f5ffc-6nnhj" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.488132 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khjc5\" (UniqueName: 
\"kubernetes.io/projected/d76a5dcc-a5c5-435c-9dfc-11bab4a422e9-kube-api-access-khjc5\") pod \"controller-5d688f5ffc-6nnhj\" (UID: \"d76a5dcc-a5c5-435c-9dfc-11bab4a422e9\") " pod="metallb-system/controller-5d688f5ffc-6nnhj" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.488150 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cfedcee2-b1bb-4709-af40-1d2c309be304-metrics-certs\") pod \"frr-k8s-w6xhg\" (UID: \"cfedcee2-b1bb-4709-af40-1d2c309be304\") " pod="metallb-system/frr-k8s-w6xhg" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.488369 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/cfedcee2-b1bb-4709-af40-1d2c309be304-frr-startup\") pod \"frr-k8s-w6xhg\" (UID: \"cfedcee2-b1bb-4709-af40-1d2c309be304\") " pod="metallb-system/frr-k8s-w6xhg" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.488554 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/cfedcee2-b1bb-4709-af40-1d2c309be304-metrics\") pod \"frr-k8s-w6xhg\" (UID: \"cfedcee2-b1bb-4709-af40-1d2c309be304\") " pod="metallb-system/frr-k8s-w6xhg" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.488617 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/cfedcee2-b1bb-4709-af40-1d2c309be304-frr-sockets\") pod \"frr-k8s-w6xhg\" (UID: \"cfedcee2-b1bb-4709-af40-1d2c309be304\") " pod="metallb-system/frr-k8s-w6xhg" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.488797 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/cfedcee2-b1bb-4709-af40-1d2c309be304-frr-conf\") pod \"frr-k8s-w6xhg\" (UID: \"cfedcee2-b1bb-4709-af40-1d2c309be304\") " pod="metallb-system/frr-k8s-w6xhg" Oct 01 10:29:19 crc 
kubenswrapper[4735]: I1001 10:29:19.489213 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/cfedcee2-b1bb-4709-af40-1d2c309be304-reloader\") pod \"frr-k8s-w6xhg\" (UID: \"cfedcee2-b1bb-4709-af40-1d2c309be304\") " pod="metallb-system/frr-k8s-w6xhg" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.493967 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cfedcee2-b1bb-4709-af40-1d2c309be304-metrics-certs\") pod \"frr-k8s-w6xhg\" (UID: \"cfedcee2-b1bb-4709-af40-1d2c309be304\") " pod="metallb-system/frr-k8s-w6xhg" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.503649 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chdtc\" (UniqueName: \"kubernetes.io/projected/cfedcee2-b1bb-4709-af40-1d2c309be304-kube-api-access-chdtc\") pod \"frr-k8s-w6xhg\" (UID: \"cfedcee2-b1bb-4709-af40-1d2c309be304\") " pod="metallb-system/frr-k8s-w6xhg" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.589656 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ea017fc5-1856-47da-99a5-c866738be35e-memberlist\") pod \"speaker-9c82t\" (UID: \"ea017fc5-1856-47da-99a5-c866738be35e\") " pod="metallb-system/speaker-9c82t" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.589712 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ea017fc5-1856-47da-99a5-c866738be35e-metallb-excludel2\") pod \"speaker-9c82t\" (UID: \"ea017fc5-1856-47da-99a5-c866738be35e\") " pod="metallb-system/speaker-9c82t" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.589732 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/d76a5dcc-a5c5-435c-9dfc-11bab4a422e9-metrics-certs\") pod \"controller-5d688f5ffc-6nnhj\" (UID: \"d76a5dcc-a5c5-435c-9dfc-11bab4a422e9\") " pod="metallb-system/controller-5d688f5ffc-6nnhj" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.589750 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khjc5\" (UniqueName: \"kubernetes.io/projected/d76a5dcc-a5c5-435c-9dfc-11bab4a422e9-kube-api-access-khjc5\") pod \"controller-5d688f5ffc-6nnhj\" (UID: \"d76a5dcc-a5c5-435c-9dfc-11bab4a422e9\") " pod="metallb-system/controller-5d688f5ffc-6nnhj" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.589785 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6fg2\" (UniqueName: \"kubernetes.io/projected/d45651e4-469d-458c-9d48-ad996f82c3f0-kube-api-access-h6fg2\") pod \"frr-k8s-webhook-server-5478bdb765-cgg5l\" (UID: \"d45651e4-469d-458c-9d48-ad996f82c3f0\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-cgg5l" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.589811 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d45651e4-469d-458c-9d48-ad996f82c3f0-cert\") pod \"frr-k8s-webhook-server-5478bdb765-cgg5l\" (UID: \"d45651e4-469d-458c-9d48-ad996f82c3f0\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-cgg5l" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.589832 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d76a5dcc-a5c5-435c-9dfc-11bab4a422e9-cert\") pod \"controller-5d688f5ffc-6nnhj\" (UID: \"d76a5dcc-a5c5-435c-9dfc-11bab4a422e9\") " pod="metallb-system/controller-5d688f5ffc-6nnhj" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.589853 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmqg2\" (UniqueName: 
\"kubernetes.io/projected/ea017fc5-1856-47da-99a5-c866738be35e-kube-api-access-qmqg2\") pod \"speaker-9c82t\" (UID: \"ea017fc5-1856-47da-99a5-c866738be35e\") " pod="metallb-system/speaker-9c82t" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.589877 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea017fc5-1856-47da-99a5-c866738be35e-metrics-certs\") pod \"speaker-9c82t\" (UID: \"ea017fc5-1856-47da-99a5-c866738be35e\") " pod="metallb-system/speaker-9c82t" Oct 01 10:29:19 crc kubenswrapper[4735]: E1001 10:29:19.590021 4735 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Oct 01 10:29:19 crc kubenswrapper[4735]: E1001 10:29:19.590073 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea017fc5-1856-47da-99a5-c866738be35e-metrics-certs podName:ea017fc5-1856-47da-99a5-c866738be35e nodeName:}" failed. No retries permitted until 2025-10-01 10:29:20.090057661 +0000 UTC m=+718.782878923 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea017fc5-1856-47da-99a5-c866738be35e-metrics-certs") pod "speaker-9c82t" (UID: "ea017fc5-1856-47da-99a5-c866738be35e") : secret "speaker-certs-secret" not found Oct 01 10:29:19 crc kubenswrapper[4735]: E1001 10:29:19.590336 4735 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 01 10:29:19 crc kubenswrapper[4735]: E1001 10:29:19.590359 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea017fc5-1856-47da-99a5-c866738be35e-memberlist podName:ea017fc5-1856-47da-99a5-c866738be35e nodeName:}" failed. No retries permitted until 2025-10-01 10:29:20.090352549 +0000 UTC m=+718.783173811 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ea017fc5-1856-47da-99a5-c866738be35e-memberlist") pod "speaker-9c82t" (UID: "ea017fc5-1856-47da-99a5-c866738be35e") : secret "metallb-memberlist" not found Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.591396 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ea017fc5-1856-47da-99a5-c866738be35e-metallb-excludel2\") pod \"speaker-9c82t\" (UID: \"ea017fc5-1856-47da-99a5-c866738be35e\") " pod="metallb-system/speaker-9c82t" Oct 01 10:29:19 crc kubenswrapper[4735]: E1001 10:29:19.591457 4735 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Oct 01 10:29:19 crc kubenswrapper[4735]: E1001 10:29:19.591483 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d76a5dcc-a5c5-435c-9dfc-11bab4a422e9-metrics-certs podName:d76a5dcc-a5c5-435c-9dfc-11bab4a422e9 nodeName:}" failed. No retries permitted until 2025-10-01 10:29:20.091475269 +0000 UTC m=+718.784296531 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d76a5dcc-a5c5-435c-9dfc-11bab4a422e9-metrics-certs") pod "controller-5d688f5ffc-6nnhj" (UID: "d76a5dcc-a5c5-435c-9dfc-11bab4a422e9") : secret "controller-certs-secret" not found Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.595322 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d45651e4-469d-458c-9d48-ad996f82c3f0-cert\") pod \"frr-k8s-webhook-server-5478bdb765-cgg5l\" (UID: \"d45651e4-469d-458c-9d48-ad996f82c3f0\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-cgg5l" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.600965 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d76a5dcc-a5c5-435c-9dfc-11bab4a422e9-cert\") pod \"controller-5d688f5ffc-6nnhj\" (UID: \"d76a5dcc-a5c5-435c-9dfc-11bab4a422e9\") " pod="metallb-system/controller-5d688f5ffc-6nnhj" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.605625 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khjc5\" (UniqueName: \"kubernetes.io/projected/d76a5dcc-a5c5-435c-9dfc-11bab4a422e9-kube-api-access-khjc5\") pod \"controller-5d688f5ffc-6nnhj\" (UID: \"d76a5dcc-a5c5-435c-9dfc-11bab4a422e9\") " pod="metallb-system/controller-5d688f5ffc-6nnhj" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.608230 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmqg2\" (UniqueName: \"kubernetes.io/projected/ea017fc5-1856-47da-99a5-c866738be35e-kube-api-access-qmqg2\") pod \"speaker-9c82t\" (UID: \"ea017fc5-1856-47da-99a5-c866738be35e\") " pod="metallb-system/speaker-9c82t" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.610610 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6fg2\" (UniqueName: 
\"kubernetes.io/projected/d45651e4-469d-458c-9d48-ad996f82c3f0-kube-api-access-h6fg2\") pod \"frr-k8s-webhook-server-5478bdb765-cgg5l\" (UID: \"d45651e4-469d-458c-9d48-ad996f82c3f0\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-cgg5l" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.615892 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-w6xhg" Oct 01 10:29:19 crc kubenswrapper[4735]: I1001 10:29:19.626262 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-cgg5l" Oct 01 10:29:20 crc kubenswrapper[4735]: I1001 10:29:20.004526 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-cgg5l"] Oct 01 10:29:20 crc kubenswrapper[4735]: W1001 10:29:20.008873 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd45651e4_469d_458c_9d48_ad996f82c3f0.slice/crio-043e2120d6d70d1397fb9a589a088fd0c063de816106b34d33532831a87f2ff2 WatchSource:0}: Error finding container 043e2120d6d70d1397fb9a589a088fd0c063de816106b34d33532831a87f2ff2: Status 404 returned error can't find the container with id 043e2120d6d70d1397fb9a589a088fd0c063de816106b34d33532831a87f2ff2 Oct 01 10:29:20 crc kubenswrapper[4735]: I1001 10:29:20.096694 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea017fc5-1856-47da-99a5-c866738be35e-metrics-certs\") pod \"speaker-9c82t\" (UID: \"ea017fc5-1856-47da-99a5-c866738be35e\") " pod="metallb-system/speaker-9c82t" Oct 01 10:29:20 crc kubenswrapper[4735]: I1001 10:29:20.096772 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ea017fc5-1856-47da-99a5-c866738be35e-memberlist\") pod \"speaker-9c82t\" (UID: 
\"ea017fc5-1856-47da-99a5-c866738be35e\") " pod="metallb-system/speaker-9c82t" Oct 01 10:29:20 crc kubenswrapper[4735]: I1001 10:29:20.096857 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d76a5dcc-a5c5-435c-9dfc-11bab4a422e9-metrics-certs\") pod \"controller-5d688f5ffc-6nnhj\" (UID: \"d76a5dcc-a5c5-435c-9dfc-11bab4a422e9\") " pod="metallb-system/controller-5d688f5ffc-6nnhj" Oct 01 10:29:20 crc kubenswrapper[4735]: E1001 10:29:20.096959 4735 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 01 10:29:20 crc kubenswrapper[4735]: E1001 10:29:20.097059 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea017fc5-1856-47da-99a5-c866738be35e-memberlist podName:ea017fc5-1856-47da-99a5-c866738be35e nodeName:}" failed. No retries permitted until 2025-10-01 10:29:21.097033367 +0000 UTC m=+719.789854639 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ea017fc5-1856-47da-99a5-c866738be35e-memberlist") pod "speaker-9c82t" (UID: "ea017fc5-1856-47da-99a5-c866738be35e") : secret "metallb-memberlist" not found
Oct 01 10:29:20 crc kubenswrapper[4735]: I1001 10:29:20.101831 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d76a5dcc-a5c5-435c-9dfc-11bab4a422e9-metrics-certs\") pod \"controller-5d688f5ffc-6nnhj\" (UID: \"d76a5dcc-a5c5-435c-9dfc-11bab4a422e9\") " pod="metallb-system/controller-5d688f5ffc-6nnhj"
Oct 01 10:29:20 crc kubenswrapper[4735]: I1001 10:29:20.102047 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea017fc5-1856-47da-99a5-c866738be35e-metrics-certs\") pod \"speaker-9c82t\" (UID: \"ea017fc5-1856-47da-99a5-c866738be35e\") " pod="metallb-system/speaker-9c82t"
Oct 01 10:29:20 crc kubenswrapper[4735]: I1001 10:29:20.341012 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-5d688f5ffc-6nnhj"
Oct 01 10:29:20 crc kubenswrapper[4735]: I1001 10:29:20.540200 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w6xhg" event={"ID":"cfedcee2-b1bb-4709-af40-1d2c309be304","Type":"ContainerStarted","Data":"fd4614c5260752d308419da8c080f82ea6f2ca8630eb1e271bde582898a71074"}
Oct 01 10:29:20 crc kubenswrapper[4735]: I1001 10:29:20.541137 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-cgg5l" event={"ID":"d45651e4-469d-458c-9d48-ad996f82c3f0","Type":"ContainerStarted","Data":"043e2120d6d70d1397fb9a589a088fd0c063de816106b34d33532831a87f2ff2"}
Oct 01 10:29:20 crc kubenswrapper[4735]: I1001 10:29:20.742543 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-6nnhj"]
Oct 01 10:29:20 crc kubenswrapper[4735]: W1001 10:29:20.753469 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd76a5dcc_a5c5_435c_9dfc_11bab4a422e9.slice/crio-a9bf607912bac9d03885ac6cb146ac768cc403980930320c8d257c1dd230b815 WatchSource:0}: Error finding container a9bf607912bac9d03885ac6cb146ac768cc403980930320c8d257c1dd230b815: Status 404 returned error can't find the container with id a9bf607912bac9d03885ac6cb146ac768cc403980930320c8d257c1dd230b815
Oct 01 10:29:21 crc kubenswrapper[4735]: I1001 10:29:21.111446 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ea017fc5-1856-47da-99a5-c866738be35e-memberlist\") pod \"speaker-9c82t\" (UID: \"ea017fc5-1856-47da-99a5-c866738be35e\") " pod="metallb-system/speaker-9c82t"
Oct 01 10:29:21 crc kubenswrapper[4735]: I1001 10:29:21.119185 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ea017fc5-1856-47da-99a5-c866738be35e-memberlist\") pod \"speaker-9c82t\" (UID: \"ea017fc5-1856-47da-99a5-c866738be35e\") " pod="metallb-system/speaker-9c82t"
Oct 01 10:29:21 crc kubenswrapper[4735]: I1001 10:29:21.209896 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-9c82t"
Oct 01 10:29:21 crc kubenswrapper[4735]: W1001 10:29:21.229382 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea017fc5_1856_47da_99a5_c866738be35e.slice/crio-9e80ae55158bfa67e918da706e0d7adb0629ae2fc36fe20856126aeb2e60a6b1 WatchSource:0}: Error finding container 9e80ae55158bfa67e918da706e0d7adb0629ae2fc36fe20856126aeb2e60a6b1: Status 404 returned error can't find the container with id 9e80ae55158bfa67e918da706e0d7adb0629ae2fc36fe20856126aeb2e60a6b1
Oct 01 10:29:21 crc kubenswrapper[4735]: I1001 10:29:21.549996 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-9c82t" event={"ID":"ea017fc5-1856-47da-99a5-c866738be35e","Type":"ContainerStarted","Data":"7438d442c55e9fd5ad5127d04b76517f0ea3ff50abd721e3c6d4d012b8724f58"}
Oct 01 10:29:21 crc kubenswrapper[4735]: I1001 10:29:21.550239 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-9c82t" event={"ID":"ea017fc5-1856-47da-99a5-c866738be35e","Type":"ContainerStarted","Data":"9e80ae55158bfa67e918da706e0d7adb0629ae2fc36fe20856126aeb2e60a6b1"}
Oct 01 10:29:21 crc kubenswrapper[4735]: I1001 10:29:21.555363 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-6nnhj" event={"ID":"d76a5dcc-a5c5-435c-9dfc-11bab4a422e9","Type":"ContainerStarted","Data":"16ce3e81839bb6246720e4ec4434c19e62566f214bad8339a6b3a6fb90a9a418"}
Oct 01 10:29:21 crc kubenswrapper[4735]: I1001 10:29:21.555402 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-6nnhj" event={"ID":"d76a5dcc-a5c5-435c-9dfc-11bab4a422e9","Type":"ContainerStarted","Data":"21425188a3e9971960058bd3d6140745a717df2ee3625561dbfeceffbcbb9374"}
Oct 01 10:29:21 crc kubenswrapper[4735]: I1001 10:29:21.555412 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-6nnhj" event={"ID":"d76a5dcc-a5c5-435c-9dfc-11bab4a422e9","Type":"ContainerStarted","Data":"a9bf607912bac9d03885ac6cb146ac768cc403980930320c8d257c1dd230b815"}
Oct 01 10:29:21 crc kubenswrapper[4735]: I1001 10:29:21.555846 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5d688f5ffc-6nnhj"
Oct 01 10:29:21 crc kubenswrapper[4735]: I1001 10:29:21.576305 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5d688f5ffc-6nnhj" podStartSLOduration=2.576282587 podStartE2EDuration="2.576282587s" podCreationTimestamp="2025-10-01 10:29:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:29:21.569739543 +0000 UTC m=+720.262568765" watchObservedRunningTime="2025-10-01 10:29:21.576282587 +0000 UTC m=+720.269103849"
Oct 01 10:29:22 crc kubenswrapper[4735]: I1001 10:29:22.566625 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-9c82t" event={"ID":"ea017fc5-1856-47da-99a5-c866738be35e","Type":"ContainerStarted","Data":"913d167a105db9a8230bb111b4238fdc98f2cbbc7536c7a73a4f5bf8c755ca89"}
Oct 01 10:29:22 crc kubenswrapper[4735]: I1001 10:29:22.566934 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-9c82t"
Oct 01 10:29:22 crc kubenswrapper[4735]: I1001 10:29:22.588065 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-9c82t" podStartSLOduration=3.5880402719999998 podStartE2EDuration="3.588040272s" podCreationTimestamp="2025-10-01 10:29:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:29:22.584918908 +0000 UTC m=+721.277740170" watchObservedRunningTime="2025-10-01 10:29:22.588040272 +0000 UTC m=+721.280861534"
Oct 01 10:29:24 crc kubenswrapper[4735]: I1001 10:29:24.062347 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wbs79"]
Oct 01 10:29:24 crc kubenswrapper[4735]: I1001 10:29:24.062587 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-wbs79" podUID="bf1e6345-2675-413e-bd53-d456e57b08bd" containerName="controller-manager" containerID="cri-o://1f9e2fa9e342bdbb074acae0e27f096f6a2f5bcd3bae1ed067cf1e4ec58a1b32" gracePeriod=30
Oct 01 10:29:24 crc kubenswrapper[4735]: I1001 10:29:24.184333 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-625dj"]
Oct 01 10:29:24 crc kubenswrapper[4735]: I1001 10:29:24.184929 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-625dj" podUID="d604b23e-ed8c-486f-b7d5-5a5ddd308947" containerName="route-controller-manager" containerID="cri-o://ed6898fb34b96da340f78c86273757ef5b076af059a3c68cf78af2621a239e21" gracePeriod=30
Oct 01 10:29:24 crc kubenswrapper[4735]: I1001 10:29:24.581453 4735 generic.go:334] "Generic (PLEG): container finished" podID="d604b23e-ed8c-486f-b7d5-5a5ddd308947" containerID="ed6898fb34b96da340f78c86273757ef5b076af059a3c68cf78af2621a239e21" exitCode=0
Oct 01 10:29:24 crc kubenswrapper[4735]: I1001 10:29:24.581546 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-625dj" event={"ID":"d604b23e-ed8c-486f-b7d5-5a5ddd308947","Type":"ContainerDied","Data":"ed6898fb34b96da340f78c86273757ef5b076af059a3c68cf78af2621a239e21"}
Oct 01 10:29:24 crc kubenswrapper[4735]: I1001 10:29:24.583882 4735 generic.go:334] "Generic (PLEG): container finished" podID="bf1e6345-2675-413e-bd53-d456e57b08bd" containerID="1f9e2fa9e342bdbb074acae0e27f096f6a2f5bcd3bae1ed067cf1e4ec58a1b32" exitCode=0
Oct 01 10:29:24 crc kubenswrapper[4735]: I1001 10:29:24.583941 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wbs79" event={"ID":"bf1e6345-2675-413e-bd53-d456e57b08bd","Type":"ContainerDied","Data":"1f9e2fa9e342bdbb074acae0e27f096f6a2f5bcd3bae1ed067cf1e4ec58a1b32"}
Oct 01 10:29:26 crc kubenswrapper[4735]: I1001 10:29:26.847242 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wbs79"
Oct 01 10:29:26 crc kubenswrapper[4735]: I1001 10:29:26.852224 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-625dj"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.002377 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf1e6345-2675-413e-bd53-d456e57b08bd-serving-cert\") pod \"bf1e6345-2675-413e-bd53-d456e57b08bd\" (UID: \"bf1e6345-2675-413e-bd53-d456e57b08bd\") "
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.002420 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d604b23e-ed8c-486f-b7d5-5a5ddd308947-client-ca\") pod \"d604b23e-ed8c-486f-b7d5-5a5ddd308947\" (UID: \"d604b23e-ed8c-486f-b7d5-5a5ddd308947\") "
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.002444 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c76mg\" (UniqueName: \"kubernetes.io/projected/d604b23e-ed8c-486f-b7d5-5a5ddd308947-kube-api-access-c76mg\") pod \"d604b23e-ed8c-486f-b7d5-5a5ddd308947\" (UID: \"d604b23e-ed8c-486f-b7d5-5a5ddd308947\") "
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.002465 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf1e6345-2675-413e-bd53-d456e57b08bd-client-ca\") pod \"bf1e6345-2675-413e-bd53-d456e57b08bd\" (UID: \"bf1e6345-2675-413e-bd53-d456e57b08bd\") "
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.002510 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf1e6345-2675-413e-bd53-d456e57b08bd-config\") pod \"bf1e6345-2675-413e-bd53-d456e57b08bd\" (UID: \"bf1e6345-2675-413e-bd53-d456e57b08bd\") "
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.002558 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d604b23e-ed8c-486f-b7d5-5a5ddd308947-config\") pod \"d604b23e-ed8c-486f-b7d5-5a5ddd308947\" (UID: \"d604b23e-ed8c-486f-b7d5-5a5ddd308947\") "
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.002588 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d604b23e-ed8c-486f-b7d5-5a5ddd308947-serving-cert\") pod \"d604b23e-ed8c-486f-b7d5-5a5ddd308947\" (UID: \"d604b23e-ed8c-486f-b7d5-5a5ddd308947\") "
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.002602 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2wnd\" (UniqueName: \"kubernetes.io/projected/bf1e6345-2675-413e-bd53-d456e57b08bd-kube-api-access-p2wnd\") pod \"bf1e6345-2675-413e-bd53-d456e57b08bd\" (UID: \"bf1e6345-2675-413e-bd53-d456e57b08bd\") "
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.002680 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf1e6345-2675-413e-bd53-d456e57b08bd-proxy-ca-bundles\") pod \"bf1e6345-2675-413e-bd53-d456e57b08bd\" (UID: \"bf1e6345-2675-413e-bd53-d456e57b08bd\") "
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.003745 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf1e6345-2675-413e-bd53-d456e57b08bd-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "bf1e6345-2675-413e-bd53-d456e57b08bd" (UID: "bf1e6345-2675-413e-bd53-d456e57b08bd"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.004408 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf1e6345-2675-413e-bd53-d456e57b08bd-client-ca" (OuterVolumeSpecName: "client-ca") pod "bf1e6345-2675-413e-bd53-d456e57b08bd" (UID: "bf1e6345-2675-413e-bd53-d456e57b08bd"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.004426 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d604b23e-ed8c-486f-b7d5-5a5ddd308947-config" (OuterVolumeSpecName: "config") pod "d604b23e-ed8c-486f-b7d5-5a5ddd308947" (UID: "d604b23e-ed8c-486f-b7d5-5a5ddd308947"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.004716 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf1e6345-2675-413e-bd53-d456e57b08bd-config" (OuterVolumeSpecName: "config") pod "bf1e6345-2675-413e-bd53-d456e57b08bd" (UID: "bf1e6345-2675-413e-bd53-d456e57b08bd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.004609 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d604b23e-ed8c-486f-b7d5-5a5ddd308947-client-ca" (OuterVolumeSpecName: "client-ca") pod "d604b23e-ed8c-486f-b7d5-5a5ddd308947" (UID: "d604b23e-ed8c-486f-b7d5-5a5ddd308947"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.008989 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf1e6345-2675-413e-bd53-d456e57b08bd-kube-api-access-p2wnd" (OuterVolumeSpecName: "kube-api-access-p2wnd") pod "bf1e6345-2675-413e-bd53-d456e57b08bd" (UID: "bf1e6345-2675-413e-bd53-d456e57b08bd"). InnerVolumeSpecName "kube-api-access-p2wnd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.009452 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d604b23e-ed8c-486f-b7d5-5a5ddd308947-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d604b23e-ed8c-486f-b7d5-5a5ddd308947" (UID: "d604b23e-ed8c-486f-b7d5-5a5ddd308947"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.009598 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d604b23e-ed8c-486f-b7d5-5a5ddd308947-kube-api-access-c76mg" (OuterVolumeSpecName: "kube-api-access-c76mg") pod "d604b23e-ed8c-486f-b7d5-5a5ddd308947" (UID: "d604b23e-ed8c-486f-b7d5-5a5ddd308947"). InnerVolumeSpecName "kube-api-access-c76mg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.010082 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf1e6345-2675-413e-bd53-d456e57b08bd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bf1e6345-2675-413e-bd53-d456e57b08bd" (UID: "bf1e6345-2675-413e-bd53-d456e57b08bd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.104466 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d604b23e-ed8c-486f-b7d5-5a5ddd308947-config\") on node \"crc\" DevicePath \"\""
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.104515 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d604b23e-ed8c-486f-b7d5-5a5ddd308947-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.104525 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2wnd\" (UniqueName: \"kubernetes.io/projected/bf1e6345-2675-413e-bd53-d456e57b08bd-kube-api-access-p2wnd\") on node \"crc\" DevicePath \"\""
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.104537 4735 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf1e6345-2675-413e-bd53-d456e57b08bd-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.104547 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf1e6345-2675-413e-bd53-d456e57b08bd-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.104555 4735 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d604b23e-ed8c-486f-b7d5-5a5ddd308947-client-ca\") on node \"crc\" DevicePath \"\""
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.104563 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c76mg\" (UniqueName: \"kubernetes.io/projected/d604b23e-ed8c-486f-b7d5-5a5ddd308947-kube-api-access-c76mg\") on node \"crc\" DevicePath \"\""
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.104571 4735 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf1e6345-2675-413e-bd53-d456e57b08bd-client-ca\") on node \"crc\" DevicePath \"\""
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.104594 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf1e6345-2675-413e-bd53-d456e57b08bd-config\") on node \"crc\" DevicePath \"\""
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.255161 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5f848c4c65-7fd8z"]
Oct 01 10:29:27 crc kubenswrapper[4735]: E1001 10:29:27.255597 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf1e6345-2675-413e-bd53-d456e57b08bd" containerName="controller-manager"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.255624 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf1e6345-2675-413e-bd53-d456e57b08bd" containerName="controller-manager"
Oct 01 10:29:27 crc kubenswrapper[4735]: E1001 10:29:27.255666 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d604b23e-ed8c-486f-b7d5-5a5ddd308947" containerName="route-controller-manager"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.255678 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d604b23e-ed8c-486f-b7d5-5a5ddd308947" containerName="route-controller-manager"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.255900 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf1e6345-2675-413e-bd53-d456e57b08bd" containerName="controller-manager"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.255926 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="d604b23e-ed8c-486f-b7d5-5a5ddd308947" containerName="route-controller-manager"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.256626 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f848c4c65-7fd8z"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.257856 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d65f46cd8-xthqk"]
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.258597 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d65f46cd8-xthqk"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.278754 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5f848c4c65-7fd8z"]
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.283050 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d65f46cd8-xthqk"]
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.407411 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c384434d-0798-4423-af64-25727390d372-config\") pod \"route-controller-manager-7d65f46cd8-xthqk\" (UID: \"c384434d-0798-4423-af64-25727390d372\") " pod="openshift-route-controller-manager/route-controller-manager-7d65f46cd8-xthqk"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.407472 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c384434d-0798-4423-af64-25727390d372-client-ca\") pod \"route-controller-manager-7d65f46cd8-xthqk\" (UID: \"c384434d-0798-4423-af64-25727390d372\") " pod="openshift-route-controller-manager/route-controller-manager-7d65f46cd8-xthqk"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.407528 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnptn\" (UniqueName: \"kubernetes.io/projected/aa1c4485-5a7b-409a-9e37-618b66589fb8-kube-api-access-fnptn\") pod \"controller-manager-5f848c4c65-7fd8z\" (UID: \"aa1c4485-5a7b-409a-9e37-618b66589fb8\") " pod="openshift-controller-manager/controller-manager-5f848c4c65-7fd8z"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.407712 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aa1c4485-5a7b-409a-9e37-618b66589fb8-client-ca\") pod \"controller-manager-5f848c4c65-7fd8z\" (UID: \"aa1c4485-5a7b-409a-9e37-618b66589fb8\") " pod="openshift-controller-manager/controller-manager-5f848c4c65-7fd8z"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.407827 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f4jc\" (UniqueName: \"kubernetes.io/projected/c384434d-0798-4423-af64-25727390d372-kube-api-access-4f4jc\") pod \"route-controller-manager-7d65f46cd8-xthqk\" (UID: \"c384434d-0798-4423-af64-25727390d372\") " pod="openshift-route-controller-manager/route-controller-manager-7d65f46cd8-xthqk"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.407879 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c384434d-0798-4423-af64-25727390d372-serving-cert\") pod \"route-controller-manager-7d65f46cd8-xthqk\" (UID: \"c384434d-0798-4423-af64-25727390d372\") " pod="openshift-route-controller-manager/route-controller-manager-7d65f46cd8-xthqk"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.408021 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa1c4485-5a7b-409a-9e37-618b66589fb8-config\") pod \"controller-manager-5f848c4c65-7fd8z\" (UID: \"aa1c4485-5a7b-409a-9e37-618b66589fb8\") " pod="openshift-controller-manager/controller-manager-5f848c4c65-7fd8z"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.408196 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aa1c4485-5a7b-409a-9e37-618b66589fb8-proxy-ca-bundles\") pod \"controller-manager-5f848c4c65-7fd8z\" (UID: \"aa1c4485-5a7b-409a-9e37-618b66589fb8\") " pod="openshift-controller-manager/controller-manager-5f848c4c65-7fd8z"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.408273 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa1c4485-5a7b-409a-9e37-618b66589fb8-serving-cert\") pod \"controller-manager-5f848c4c65-7fd8z\" (UID: \"aa1c4485-5a7b-409a-9e37-618b66589fb8\") " pod="openshift-controller-manager/controller-manager-5f848c4c65-7fd8z"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.509713 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnptn\" (UniqueName: \"kubernetes.io/projected/aa1c4485-5a7b-409a-9e37-618b66589fb8-kube-api-access-fnptn\") pod \"controller-manager-5f848c4c65-7fd8z\" (UID: \"aa1c4485-5a7b-409a-9e37-618b66589fb8\") " pod="openshift-controller-manager/controller-manager-5f848c4c65-7fd8z"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.509776 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aa1c4485-5a7b-409a-9e37-618b66589fb8-client-ca\") pod \"controller-manager-5f848c4c65-7fd8z\" (UID: \"aa1c4485-5a7b-409a-9e37-618b66589fb8\") " pod="openshift-controller-manager/controller-manager-5f848c4c65-7fd8z"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.509806 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f4jc\" (UniqueName: \"kubernetes.io/projected/c384434d-0798-4423-af64-25727390d372-kube-api-access-4f4jc\") pod \"route-controller-manager-7d65f46cd8-xthqk\" (UID: \"c384434d-0798-4423-af64-25727390d372\") " pod="openshift-route-controller-manager/route-controller-manager-7d65f46cd8-xthqk"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.509830 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c384434d-0798-4423-af64-25727390d372-serving-cert\") pod \"route-controller-manager-7d65f46cd8-xthqk\" (UID: \"c384434d-0798-4423-af64-25727390d372\") " pod="openshift-route-controller-manager/route-controller-manager-7d65f46cd8-xthqk"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.509867 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa1c4485-5a7b-409a-9e37-618b66589fb8-config\") pod \"controller-manager-5f848c4c65-7fd8z\" (UID: \"aa1c4485-5a7b-409a-9e37-618b66589fb8\") " pod="openshift-controller-manager/controller-manager-5f848c4c65-7fd8z"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.509910 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aa1c4485-5a7b-409a-9e37-618b66589fb8-proxy-ca-bundles\") pod \"controller-manager-5f848c4c65-7fd8z\" (UID: \"aa1c4485-5a7b-409a-9e37-618b66589fb8\") " pod="openshift-controller-manager/controller-manager-5f848c4c65-7fd8z"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.509939 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa1c4485-5a7b-409a-9e37-618b66589fb8-serving-cert\") pod \"controller-manager-5f848c4c65-7fd8z\" (UID: \"aa1c4485-5a7b-409a-9e37-618b66589fb8\") " pod="openshift-controller-manager/controller-manager-5f848c4c65-7fd8z"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.509984 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c384434d-0798-4423-af64-25727390d372-config\") pod \"route-controller-manager-7d65f46cd8-xthqk\" (UID: \"c384434d-0798-4423-af64-25727390d372\") " pod="openshift-route-controller-manager/route-controller-manager-7d65f46cd8-xthqk"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.510012 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c384434d-0798-4423-af64-25727390d372-client-ca\") pod \"route-controller-manager-7d65f46cd8-xthqk\" (UID: \"c384434d-0798-4423-af64-25727390d372\") " pod="openshift-route-controller-manager/route-controller-manager-7d65f46cd8-xthqk"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.510821 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aa1c4485-5a7b-409a-9e37-618b66589fb8-client-ca\") pod \"controller-manager-5f848c4c65-7fd8z\" (UID: \"aa1c4485-5a7b-409a-9e37-618b66589fb8\") " pod="openshift-controller-manager/controller-manager-5f848c4c65-7fd8z"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.511356 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aa1c4485-5a7b-409a-9e37-618b66589fb8-proxy-ca-bundles\") pod \"controller-manager-5f848c4c65-7fd8z\" (UID: \"aa1c4485-5a7b-409a-9e37-618b66589fb8\") " pod="openshift-controller-manager/controller-manager-5f848c4c65-7fd8z"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.512082 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa1c4485-5a7b-409a-9e37-618b66589fb8-config\") pod \"controller-manager-5f848c4c65-7fd8z\" (UID: \"aa1c4485-5a7b-409a-9e37-618b66589fb8\") " pod="openshift-controller-manager/controller-manager-5f848c4c65-7fd8z"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.512429 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c384434d-0798-4423-af64-25727390d372-config\") pod \"route-controller-manager-7d65f46cd8-xthqk\" (UID: \"c384434d-0798-4423-af64-25727390d372\") " pod="openshift-route-controller-manager/route-controller-manager-7d65f46cd8-xthqk"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.512477 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c384434d-0798-4423-af64-25727390d372-client-ca\") pod \"route-controller-manager-7d65f46cd8-xthqk\" (UID: \"c384434d-0798-4423-af64-25727390d372\") " pod="openshift-route-controller-manager/route-controller-manager-7d65f46cd8-xthqk"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.513755 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c384434d-0798-4423-af64-25727390d372-serving-cert\") pod \"route-controller-manager-7d65f46cd8-xthqk\" (UID: \"c384434d-0798-4423-af64-25727390d372\") " pod="openshift-route-controller-manager/route-controller-manager-7d65f46cd8-xthqk"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.514204 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa1c4485-5a7b-409a-9e37-618b66589fb8-serving-cert\") pod \"controller-manager-5f848c4c65-7fd8z\" (UID: \"aa1c4485-5a7b-409a-9e37-618b66589fb8\") " pod="openshift-controller-manager/controller-manager-5f848c4c65-7fd8z"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.526581 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f4jc\" (UniqueName: \"kubernetes.io/projected/c384434d-0798-4423-af64-25727390d372-kube-api-access-4f4jc\") pod \"route-controller-manager-7d65f46cd8-xthqk\" (UID: \"c384434d-0798-4423-af64-25727390d372\") " pod="openshift-route-controller-manager/route-controller-manager-7d65f46cd8-xthqk"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.532914 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnptn\" (UniqueName: \"kubernetes.io/projected/aa1c4485-5a7b-409a-9e37-618b66589fb8-kube-api-access-fnptn\") pod \"controller-manager-5f848c4c65-7fd8z\" (UID: \"aa1c4485-5a7b-409a-9e37-618b66589fb8\") " pod="openshift-controller-manager/controller-manager-5f848c4c65-7fd8z"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.585246 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f848c4c65-7fd8z"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.600731 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d65f46cd8-xthqk"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.613113 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-cgg5l" event={"ID":"d45651e4-469d-458c-9d48-ad996f82c3f0","Type":"ContainerStarted","Data":"ed67b36bd3b44f8d0184ed025b291ddc5fac3cd8a4d7012a158d537ec3f2110c"}
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.613820 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-cgg5l"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.616203 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-625dj" event={"ID":"d604b23e-ed8c-486f-b7d5-5a5ddd308947","Type":"ContainerDied","Data":"255e42b188080997a5dea4e08ffe0b07a757383c89c3f1d2bb04affcee905698"}
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.616439 4735 scope.go:117] "RemoveContainer" containerID="ed6898fb34b96da340f78c86273757ef5b076af059a3c68cf78af2621a239e21"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.616461 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-625dj"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.619376 4735 generic.go:334] "Generic (PLEG): container finished" podID="cfedcee2-b1bb-4709-af40-1d2c309be304" containerID="007485ef8c226d523f26559e273799722851798f90589448ffdce9e85ebe8f4f" exitCode=0
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.619833 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w6xhg" event={"ID":"cfedcee2-b1bb-4709-af40-1d2c309be304","Type":"ContainerDied","Data":"007485ef8c226d523f26559e273799722851798f90589448ffdce9e85ebe8f4f"}
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.623397 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wbs79" event={"ID":"bf1e6345-2675-413e-bd53-d456e57b08bd","Type":"ContainerDied","Data":"d34ae32d5f39d406d414c6692dcccd73354c2ed3f897262a19805b7e5d30cfb2"}
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.623521 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wbs79"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.637063 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-cgg5l" podStartSLOduration=1.9638912469999998 podStartE2EDuration="8.637041653s" podCreationTimestamp="2025-10-01 10:29:19 +0000 UTC" firstStartedPulling="2025-10-01 10:29:20.011765889 +0000 UTC m=+718.704587151" lastFinishedPulling="2025-10-01 10:29:26.684916255 +0000 UTC m=+725.377737557" observedRunningTime="2025-10-01 10:29:27.633601031 +0000 UTC m=+726.326422313" watchObservedRunningTime="2025-10-01 10:29:27.637041653 +0000 UTC m=+726.329862915"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.718308 4735 scope.go:117] "RemoveContainer" containerID="1f9e2fa9e342bdbb074acae0e27f096f6a2f5bcd3bae1ed067cf1e4ec58a1b32"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.722298 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-625dj"]
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.741673 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-625dj"]
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.765559 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wbs79"]
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.768674 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wbs79"]
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.906765 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf1e6345-2675-413e-bd53-d456e57b08bd" path="/var/lib/kubelet/pods/bf1e6345-2675-413e-bd53-d456e57b08bd/volumes"
Oct 01 10:29:27 crc kubenswrapper[4735]: I1001
10:29:27.907650 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d604b23e-ed8c-486f-b7d5-5a5ddd308947" path="/var/lib/kubelet/pods/d604b23e-ed8c-486f-b7d5-5a5ddd308947/volumes" Oct 01 10:29:27 crc kubenswrapper[4735]: I1001 10:29:27.987093 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5f848c4c65-7fd8z"] Oct 01 10:29:28 crc kubenswrapper[4735]: I1001 10:29:28.142213 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d65f46cd8-xthqk"] Oct 01 10:29:28 crc kubenswrapper[4735]: W1001 10:29:28.146481 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc384434d_0798_4423_af64_25727390d372.slice/crio-81cd09bab22c56326dfd962dd4a644ce3347b99d2e02548359f7c9a9af4047b1 WatchSource:0}: Error finding container 81cd09bab22c56326dfd962dd4a644ce3347b99d2e02548359f7c9a9af4047b1: Status 404 returned error can't find the container with id 81cd09bab22c56326dfd962dd4a644ce3347b99d2e02548359f7c9a9af4047b1 Oct 01 10:29:28 crc kubenswrapper[4735]: I1001 10:29:28.631877 4735 generic.go:334] "Generic (PLEG): container finished" podID="cfedcee2-b1bb-4709-af40-1d2c309be304" containerID="3d7fd32cc0d519475a32046fb79d243abec125e18dbce252929bb00a415d60df" exitCode=0 Oct 01 10:29:28 crc kubenswrapper[4735]: I1001 10:29:28.631952 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w6xhg" event={"ID":"cfedcee2-b1bb-4709-af40-1d2c309be304","Type":"ContainerDied","Data":"3d7fd32cc0d519475a32046fb79d243abec125e18dbce252929bb00a415d60df"} Oct 01 10:29:28 crc kubenswrapper[4735]: I1001 10:29:28.636256 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f848c4c65-7fd8z" 
event={"ID":"aa1c4485-5a7b-409a-9e37-618b66589fb8","Type":"ContainerStarted","Data":"1fae625061f255a24f41bc5751da109a9ab8216849880d5ee30578b8044b5dc5"} Oct 01 10:29:28 crc kubenswrapper[4735]: I1001 10:29:28.636307 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f848c4c65-7fd8z" event={"ID":"aa1c4485-5a7b-409a-9e37-618b66589fb8","Type":"ContainerStarted","Data":"a2481715911734a8481e84d3177e4ec3e8d866bf241eb71a7b52fc2f037bbe99"} Oct 01 10:29:28 crc kubenswrapper[4735]: I1001 10:29:28.636442 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5f848c4c65-7fd8z" Oct 01 10:29:28 crc kubenswrapper[4735]: I1001 10:29:28.637955 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d65f46cd8-xthqk" event={"ID":"c384434d-0798-4423-af64-25727390d372","Type":"ContainerStarted","Data":"b9821d10ec4ee16919eb95ffbf681523665ac4162b9735692d4eabb198fe110a"} Oct 01 10:29:28 crc kubenswrapper[4735]: I1001 10:29:28.637996 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d65f46cd8-xthqk" event={"ID":"c384434d-0798-4423-af64-25727390d372","Type":"ContainerStarted","Data":"81cd09bab22c56326dfd962dd4a644ce3347b99d2e02548359f7c9a9af4047b1"} Oct 01 10:29:28 crc kubenswrapper[4735]: I1001 10:29:28.638184 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7d65f46cd8-xthqk" Oct 01 10:29:28 crc kubenswrapper[4735]: I1001 10:29:28.641859 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5f848c4c65-7fd8z" Oct 01 10:29:28 crc kubenswrapper[4735]: I1001 10:29:28.726543 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-7d65f46cd8-xthqk" podStartSLOduration=3.726517724 podStartE2EDuration="3.726517724s" podCreationTimestamp="2025-10-01 10:29:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:29:28.697643056 +0000 UTC m=+727.390464318" watchObservedRunningTime="2025-10-01 10:29:28.726517724 +0000 UTC m=+727.419338986" Oct 01 10:29:28 crc kubenswrapper[4735]: I1001 10:29:28.727203 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5f848c4c65-7fd8z" podStartSLOduration=3.727199072 podStartE2EDuration="3.727199072s" podCreationTimestamp="2025-10-01 10:29:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:29:28.718549262 +0000 UTC m=+727.411370524" watchObservedRunningTime="2025-10-01 10:29:28.727199072 +0000 UTC m=+727.420020334" Oct 01 10:29:28 crc kubenswrapper[4735]: I1001 10:29:28.984614 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7d65f46cd8-xthqk" Oct 01 10:29:29 crc kubenswrapper[4735]: I1001 10:29:29.653835 4735 generic.go:334] "Generic (PLEG): container finished" podID="cfedcee2-b1bb-4709-af40-1d2c309be304" containerID="398418ee2eab80144dbca43cf5271bcbde2f832d321c870401e9587632bbd3b6" exitCode=0 Oct 01 10:29:29 crc kubenswrapper[4735]: I1001 10:29:29.653944 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w6xhg" event={"ID":"cfedcee2-b1bb-4709-af40-1d2c309be304","Type":"ContainerDied","Data":"398418ee2eab80144dbca43cf5271bcbde2f832d321c870401e9587632bbd3b6"} Oct 01 10:29:30 crc kubenswrapper[4735]: I1001 10:29:30.348910 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/controller-5d688f5ffc-6nnhj" Oct 01 10:29:30 crc kubenswrapper[4735]: I1001 10:29:30.664843 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w6xhg" event={"ID":"cfedcee2-b1bb-4709-af40-1d2c309be304","Type":"ContainerStarted","Data":"a8024262be49798dd8c5baceb7ee56df63f1c0ab24d5ed77cb382e9f62dc5d26"} Oct 01 10:29:31 crc kubenswrapper[4735]: I1001 10:29:31.215710 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-9c82t" Oct 01 10:29:31 crc kubenswrapper[4735]: I1001 10:29:31.675851 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w6xhg" event={"ID":"cfedcee2-b1bb-4709-af40-1d2c309be304","Type":"ContainerStarted","Data":"e5f6aaafa351c4a6a310e396f72f692fe5d9b71fb8bc6190131f15331d7c833a"} Oct 01 10:29:31 crc kubenswrapper[4735]: I1001 10:29:31.675903 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w6xhg" event={"ID":"cfedcee2-b1bb-4709-af40-1d2c309be304","Type":"ContainerStarted","Data":"02cafeef1536c33b89d8182a14aa9c9f33c84a88de98108c7b879bbefee979c7"} Oct 01 10:29:31 crc kubenswrapper[4735]: I1001 10:29:31.675916 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w6xhg" event={"ID":"cfedcee2-b1bb-4709-af40-1d2c309be304","Type":"ContainerStarted","Data":"37c1e1bae5be4cf809de9646022cc2ff7818d39dd86b332591c7a1c6cbdecf17"} Oct 01 10:29:31 crc kubenswrapper[4735]: I1001 10:29:31.675925 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w6xhg" event={"ID":"cfedcee2-b1bb-4709-af40-1d2c309be304","Type":"ContainerStarted","Data":"f5f60d7de6bbeeba29dc02050d0764713fd96f34a0681608da25c0295fa5885e"} Oct 01 10:29:32 crc kubenswrapper[4735]: I1001 10:29:32.692926 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w6xhg" 
event={"ID":"cfedcee2-b1bb-4709-af40-1d2c309be304","Type":"ContainerStarted","Data":"bca0de8f1a8d0b6e6d021fbfaace94691d9fb5790c6240bbe8f3337d8ef11417"} Oct 01 10:29:32 crc kubenswrapper[4735]: I1001 10:29:32.693371 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-w6xhg" Oct 01 10:29:32 crc kubenswrapper[4735]: I1001 10:29:32.722979 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-w6xhg" podStartSLOduration=6.813324978 podStartE2EDuration="13.722959175s" podCreationTimestamp="2025-10-01 10:29:19 +0000 UTC" firstStartedPulling="2025-10-01 10:29:19.75471054 +0000 UTC m=+718.447531802" lastFinishedPulling="2025-10-01 10:29:26.664344727 +0000 UTC m=+725.357165999" observedRunningTime="2025-10-01 10:29:32.719136534 +0000 UTC m=+731.411957796" watchObservedRunningTime="2025-10-01 10:29:32.722959175 +0000 UTC m=+731.415780437" Oct 01 10:29:33 crc kubenswrapper[4735]: I1001 10:29:33.877466 4735 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 01 10:29:34 crc kubenswrapper[4735]: I1001 10:29:34.598415 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-szk4c"] Oct 01 10:29:34 crc kubenswrapper[4735]: I1001 10:29:34.603752 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-szk4c" Oct 01 10:29:34 crc kubenswrapper[4735]: I1001 10:29:34.610624 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 01 10:29:34 crc kubenswrapper[4735]: I1001 10:29:34.611153 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 01 10:29:34 crc kubenswrapper[4735]: I1001 10:29:34.615191 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-szk4c"] Oct 01 10:29:34 crc kubenswrapper[4735]: I1001 10:29:34.616265 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-w6xhg" Oct 01 10:29:34 crc kubenswrapper[4735]: I1001 10:29:34.680572 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-w6xhg" Oct 01 10:29:34 crc kubenswrapper[4735]: I1001 10:29:34.711188 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlwbj\" (UniqueName: \"kubernetes.io/projected/273a8e2f-e052-41c0-a1d0-677b8e7dd9ef-kube-api-access-qlwbj\") pod \"openstack-operator-index-szk4c\" (UID: \"273a8e2f-e052-41c0-a1d0-677b8e7dd9ef\") " pod="openstack-operators/openstack-operator-index-szk4c" Oct 01 10:29:34 crc kubenswrapper[4735]: I1001 10:29:34.812889 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlwbj\" (UniqueName: \"kubernetes.io/projected/273a8e2f-e052-41c0-a1d0-677b8e7dd9ef-kube-api-access-qlwbj\") pod \"openstack-operator-index-szk4c\" (UID: \"273a8e2f-e052-41c0-a1d0-677b8e7dd9ef\") " pod="openstack-operators/openstack-operator-index-szk4c" Oct 01 10:29:34 crc kubenswrapper[4735]: I1001 10:29:34.835683 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlwbj\" (UniqueName: 
\"kubernetes.io/projected/273a8e2f-e052-41c0-a1d0-677b8e7dd9ef-kube-api-access-qlwbj\") pod \"openstack-operator-index-szk4c\" (UID: \"273a8e2f-e052-41c0-a1d0-677b8e7dd9ef\") " pod="openstack-operators/openstack-operator-index-szk4c" Oct 01 10:29:34 crc kubenswrapper[4735]: I1001 10:29:34.922232 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-szk4c" Oct 01 10:29:35 crc kubenswrapper[4735]: I1001 10:29:35.321304 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-szk4c"] Oct 01 10:29:35 crc kubenswrapper[4735]: W1001 10:29:35.326174 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod273a8e2f_e052_41c0_a1d0_677b8e7dd9ef.slice/crio-8dd27c96cbfa7fe106bd68f0b1b4846a6bd55d8c9b9e8fd9e2499cb0df7dbede WatchSource:0}: Error finding container 8dd27c96cbfa7fe106bd68f0b1b4846a6bd55d8c9b9e8fd9e2499cb0df7dbede: Status 404 returned error can't find the container with id 8dd27c96cbfa7fe106bd68f0b1b4846a6bd55d8c9b9e8fd9e2499cb0df7dbede Oct 01 10:29:35 crc kubenswrapper[4735]: I1001 10:29:35.486207 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 10:29:35 crc kubenswrapper[4735]: I1001 10:29:35.486289 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 10:29:35 crc kubenswrapper[4735]: I1001 10:29:35.710650 4735 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/openstack-operator-index-szk4c" event={"ID":"273a8e2f-e052-41c0-a1d0-677b8e7dd9ef","Type":"ContainerStarted","Data":"8dd27c96cbfa7fe106bd68f0b1b4846a6bd55d8c9b9e8fd9e2499cb0df7dbede"} Oct 01 10:29:38 crc kubenswrapper[4735]: I1001 10:29:38.736713 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-szk4c" event={"ID":"273a8e2f-e052-41c0-a1d0-677b8e7dd9ef","Type":"ContainerStarted","Data":"75cb9a765ef64aafc50cc5b97370fe981f9f511287010f4afc649d5ab0bf4364"} Oct 01 10:29:38 crc kubenswrapper[4735]: I1001 10:29:38.764181 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-szk4c" podStartSLOduration=2.038784561 podStartE2EDuration="4.76414619s" podCreationTimestamp="2025-10-01 10:29:34 +0000 UTC" firstStartedPulling="2025-10-01 10:29:35.328809185 +0000 UTC m=+734.021630447" lastFinishedPulling="2025-10-01 10:29:38.054170814 +0000 UTC m=+736.746992076" observedRunningTime="2025-10-01 10:29:38.759144287 +0000 UTC m=+737.451965589" watchObservedRunningTime="2025-10-01 10:29:38.76414619 +0000 UTC m=+737.456967492" Oct 01 10:29:38 crc kubenswrapper[4735]: I1001 10:29:38.775422 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-szk4c"] Oct 01 10:29:39 crc kubenswrapper[4735]: I1001 10:29:39.580429 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-jmdr7"] Oct 01 10:29:39 crc kubenswrapper[4735]: I1001 10:29:39.581611 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-jmdr7" Oct 01 10:29:39 crc kubenswrapper[4735]: I1001 10:29:39.584374 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-v859q" Oct 01 10:29:39 crc kubenswrapper[4735]: I1001 10:29:39.617676 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jmdr7"] Oct 01 10:29:39 crc kubenswrapper[4735]: I1001 10:29:39.620789 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-w6xhg" Oct 01 10:29:39 crc kubenswrapper[4735]: I1001 10:29:39.640188 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-cgg5l" Oct 01 10:29:39 crc kubenswrapper[4735]: I1001 10:29:39.694629 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx8qf\" (UniqueName: \"kubernetes.io/projected/bf036ed7-e2ad-407c-94d5-ce386d9884b8-kube-api-access-kx8qf\") pod \"openstack-operator-index-jmdr7\" (UID: \"bf036ed7-e2ad-407c-94d5-ce386d9884b8\") " pod="openstack-operators/openstack-operator-index-jmdr7" Oct 01 10:29:39 crc kubenswrapper[4735]: I1001 10:29:39.796253 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx8qf\" (UniqueName: \"kubernetes.io/projected/bf036ed7-e2ad-407c-94d5-ce386d9884b8-kube-api-access-kx8qf\") pod \"openstack-operator-index-jmdr7\" (UID: \"bf036ed7-e2ad-407c-94d5-ce386d9884b8\") " pod="openstack-operators/openstack-operator-index-jmdr7" Oct 01 10:29:39 crc kubenswrapper[4735]: I1001 10:29:39.815866 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx8qf\" (UniqueName: \"kubernetes.io/projected/bf036ed7-e2ad-407c-94d5-ce386d9884b8-kube-api-access-kx8qf\") pod \"openstack-operator-index-jmdr7\" (UID: 
\"bf036ed7-e2ad-407c-94d5-ce386d9884b8\") " pod="openstack-operators/openstack-operator-index-jmdr7" Oct 01 10:29:39 crc kubenswrapper[4735]: I1001 10:29:39.925380 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jmdr7" Oct 01 10:29:39 crc kubenswrapper[4735]: I1001 10:29:39.987489 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5g2md"] Oct 01 10:29:39 crc kubenswrapper[4735]: I1001 10:29:39.989209 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5g2md" Oct 01 10:29:39 crc kubenswrapper[4735]: I1001 10:29:39.997808 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5g2md"] Oct 01 10:29:40 crc kubenswrapper[4735]: I1001 10:29:40.100044 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/379a0f6c-7c60-4328-9127-5cecc876bf76-catalog-content\") pod \"community-operators-5g2md\" (UID: \"379a0f6c-7c60-4328-9127-5cecc876bf76\") " pod="openshift-marketplace/community-operators-5g2md" Oct 01 10:29:40 crc kubenswrapper[4735]: I1001 10:29:40.100423 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-666dp\" (UniqueName: \"kubernetes.io/projected/379a0f6c-7c60-4328-9127-5cecc876bf76-kube-api-access-666dp\") pod \"community-operators-5g2md\" (UID: \"379a0f6c-7c60-4328-9127-5cecc876bf76\") " pod="openshift-marketplace/community-operators-5g2md" Oct 01 10:29:40 crc kubenswrapper[4735]: I1001 10:29:40.100478 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/379a0f6c-7c60-4328-9127-5cecc876bf76-utilities\") pod \"community-operators-5g2md\" (UID: 
\"379a0f6c-7c60-4328-9127-5cecc876bf76\") " pod="openshift-marketplace/community-operators-5g2md" Oct 01 10:29:40 crc kubenswrapper[4735]: I1001 10:29:40.202182 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/379a0f6c-7c60-4328-9127-5cecc876bf76-utilities\") pod \"community-operators-5g2md\" (UID: \"379a0f6c-7c60-4328-9127-5cecc876bf76\") " pod="openshift-marketplace/community-operators-5g2md" Oct 01 10:29:40 crc kubenswrapper[4735]: I1001 10:29:40.202309 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/379a0f6c-7c60-4328-9127-5cecc876bf76-catalog-content\") pod \"community-operators-5g2md\" (UID: \"379a0f6c-7c60-4328-9127-5cecc876bf76\") " pod="openshift-marketplace/community-operators-5g2md" Oct 01 10:29:40 crc kubenswrapper[4735]: I1001 10:29:40.202336 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-666dp\" (UniqueName: \"kubernetes.io/projected/379a0f6c-7c60-4328-9127-5cecc876bf76-kube-api-access-666dp\") pod \"community-operators-5g2md\" (UID: \"379a0f6c-7c60-4328-9127-5cecc876bf76\") " pod="openshift-marketplace/community-operators-5g2md" Oct 01 10:29:40 crc kubenswrapper[4735]: I1001 10:29:40.204721 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/379a0f6c-7c60-4328-9127-5cecc876bf76-catalog-content\") pod \"community-operators-5g2md\" (UID: \"379a0f6c-7c60-4328-9127-5cecc876bf76\") " pod="openshift-marketplace/community-operators-5g2md" Oct 01 10:29:40 crc kubenswrapper[4735]: I1001 10:29:40.204740 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/379a0f6c-7c60-4328-9127-5cecc876bf76-utilities\") pod \"community-operators-5g2md\" (UID: \"379a0f6c-7c60-4328-9127-5cecc876bf76\") 
" pod="openshift-marketplace/community-operators-5g2md" Oct 01 10:29:40 crc kubenswrapper[4735]: I1001 10:29:40.222956 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-666dp\" (UniqueName: \"kubernetes.io/projected/379a0f6c-7c60-4328-9127-5cecc876bf76-kube-api-access-666dp\") pod \"community-operators-5g2md\" (UID: \"379a0f6c-7c60-4328-9127-5cecc876bf76\") " pod="openshift-marketplace/community-operators-5g2md" Oct 01 10:29:40 crc kubenswrapper[4735]: I1001 10:29:40.315185 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5g2md" Oct 01 10:29:40 crc kubenswrapper[4735]: I1001 10:29:40.372415 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jmdr7"] Oct 01 10:29:40 crc kubenswrapper[4735]: I1001 10:29:40.750995 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jmdr7" event={"ID":"bf036ed7-e2ad-407c-94d5-ce386d9884b8","Type":"ContainerStarted","Data":"570d97bfec6866a394823d594ca3339c2967a6395657ee4e321029409f097f16"} Oct 01 10:29:40 crc kubenswrapper[4735]: I1001 10:29:40.751411 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jmdr7" event={"ID":"bf036ed7-e2ad-407c-94d5-ce386d9884b8","Type":"ContainerStarted","Data":"7385f0406ca608e5ab95e856c177e4ca6bd8676af7c8578f13c63a2c50f62042"} Oct 01 10:29:40 crc kubenswrapper[4735]: I1001 10:29:40.751170 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-szk4c" podUID="273a8e2f-e052-41c0-a1d0-677b8e7dd9ef" containerName="registry-server" containerID="cri-o://75cb9a765ef64aafc50cc5b97370fe981f9f511287010f4afc649d5ab0bf4364" gracePeriod=2 Oct 01 10:29:40 crc kubenswrapper[4735]: I1001 10:29:40.771307 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-index-jmdr7" podStartSLOduration=1.709469898 podStartE2EDuration="1.771283413s" podCreationTimestamp="2025-10-01 10:29:39 +0000 UTC" firstStartedPulling="2025-10-01 10:29:40.401709002 +0000 UTC m=+739.094530264" lastFinishedPulling="2025-10-01 10:29:40.463522517 +0000 UTC m=+739.156343779" observedRunningTime="2025-10-01 10:29:40.769454285 +0000 UTC m=+739.462275557" watchObservedRunningTime="2025-10-01 10:29:40.771283413 +0000 UTC m=+739.464104675" Oct 01 10:29:40 crc kubenswrapper[4735]: I1001 10:29:40.787672 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5g2md"] Oct 01 10:29:40 crc kubenswrapper[4735]: W1001 10:29:40.843585 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod379a0f6c_7c60_4328_9127_5cecc876bf76.slice/crio-2e35d809aeaaab35c3635946e3dad7047cb90018061c212ac7ca5895105f1ef9 WatchSource:0}: Error finding container 2e35d809aeaaab35c3635946e3dad7047cb90018061c212ac7ca5895105f1ef9: Status 404 returned error can't find the container with id 2e35d809aeaaab35c3635946e3dad7047cb90018061c212ac7ca5895105f1ef9 Oct 01 10:29:41 crc kubenswrapper[4735]: I1001 10:29:41.234395 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-szk4c" Oct 01 10:29:41 crc kubenswrapper[4735]: I1001 10:29:41.419694 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlwbj\" (UniqueName: \"kubernetes.io/projected/273a8e2f-e052-41c0-a1d0-677b8e7dd9ef-kube-api-access-qlwbj\") pod \"273a8e2f-e052-41c0-a1d0-677b8e7dd9ef\" (UID: \"273a8e2f-e052-41c0-a1d0-677b8e7dd9ef\") " Oct 01 10:29:41 crc kubenswrapper[4735]: I1001 10:29:41.425644 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/273a8e2f-e052-41c0-a1d0-677b8e7dd9ef-kube-api-access-qlwbj" (OuterVolumeSpecName: "kube-api-access-qlwbj") pod "273a8e2f-e052-41c0-a1d0-677b8e7dd9ef" (UID: "273a8e2f-e052-41c0-a1d0-677b8e7dd9ef"). InnerVolumeSpecName "kube-api-access-qlwbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:29:41 crc kubenswrapper[4735]: I1001 10:29:41.521208 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlwbj\" (UniqueName: \"kubernetes.io/projected/273a8e2f-e052-41c0-a1d0-677b8e7dd9ef-kube-api-access-qlwbj\") on node \"crc\" DevicePath \"\"" Oct 01 10:29:41 crc kubenswrapper[4735]: I1001 10:29:41.762469 4735 generic.go:334] "Generic (PLEG): container finished" podID="273a8e2f-e052-41c0-a1d0-677b8e7dd9ef" containerID="75cb9a765ef64aafc50cc5b97370fe981f9f511287010f4afc649d5ab0bf4364" exitCode=0 Oct 01 10:29:41 crc kubenswrapper[4735]: I1001 10:29:41.762622 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-szk4c" event={"ID":"273a8e2f-e052-41c0-a1d0-677b8e7dd9ef","Type":"ContainerDied","Data":"75cb9a765ef64aafc50cc5b97370fe981f9f511287010f4afc649d5ab0bf4364"} Oct 01 10:29:41 crc kubenswrapper[4735]: I1001 10:29:41.762669 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-szk4c" 
event={"ID":"273a8e2f-e052-41c0-a1d0-677b8e7dd9ef","Type":"ContainerDied","Data":"8dd27c96cbfa7fe106bd68f0b1b4846a6bd55d8c9b9e8fd9e2499cb0df7dbede"} Oct 01 10:29:41 crc kubenswrapper[4735]: I1001 10:29:41.762676 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-szk4c" Oct 01 10:29:41 crc kubenswrapper[4735]: I1001 10:29:41.762700 4735 scope.go:117] "RemoveContainer" containerID="75cb9a765ef64aafc50cc5b97370fe981f9f511287010f4afc649d5ab0bf4364" Oct 01 10:29:41 crc kubenswrapper[4735]: I1001 10:29:41.765909 4735 generic.go:334] "Generic (PLEG): container finished" podID="379a0f6c-7c60-4328-9127-5cecc876bf76" containerID="4d0e78459fb58210bddad9f420ff2ed79c86c920d5080e1fc095c63b24d46215" exitCode=0 Oct 01 10:29:41 crc kubenswrapper[4735]: I1001 10:29:41.766041 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5g2md" event={"ID":"379a0f6c-7c60-4328-9127-5cecc876bf76","Type":"ContainerDied","Data":"4d0e78459fb58210bddad9f420ff2ed79c86c920d5080e1fc095c63b24d46215"} Oct 01 10:29:41 crc kubenswrapper[4735]: I1001 10:29:41.766088 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5g2md" event={"ID":"379a0f6c-7c60-4328-9127-5cecc876bf76","Type":"ContainerStarted","Data":"2e35d809aeaaab35c3635946e3dad7047cb90018061c212ac7ca5895105f1ef9"} Oct 01 10:29:41 crc kubenswrapper[4735]: I1001 10:29:41.783930 4735 scope.go:117] "RemoveContainer" containerID="75cb9a765ef64aafc50cc5b97370fe981f9f511287010f4afc649d5ab0bf4364" Oct 01 10:29:41 crc kubenswrapper[4735]: E1001 10:29:41.784569 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75cb9a765ef64aafc50cc5b97370fe981f9f511287010f4afc649d5ab0bf4364\": container with ID starting with 75cb9a765ef64aafc50cc5b97370fe981f9f511287010f4afc649d5ab0bf4364 not found: ID does not exist" 
containerID="75cb9a765ef64aafc50cc5b97370fe981f9f511287010f4afc649d5ab0bf4364" Oct 01 10:29:41 crc kubenswrapper[4735]: I1001 10:29:41.784703 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75cb9a765ef64aafc50cc5b97370fe981f9f511287010f4afc649d5ab0bf4364"} err="failed to get container status \"75cb9a765ef64aafc50cc5b97370fe981f9f511287010f4afc649d5ab0bf4364\": rpc error: code = NotFound desc = could not find container \"75cb9a765ef64aafc50cc5b97370fe981f9f511287010f4afc649d5ab0bf4364\": container with ID starting with 75cb9a765ef64aafc50cc5b97370fe981f9f511287010f4afc649d5ab0bf4364 not found: ID does not exist" Oct 01 10:29:41 crc kubenswrapper[4735]: I1001 10:29:41.818975 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-szk4c"] Oct 01 10:29:41 crc kubenswrapper[4735]: I1001 10:29:41.820630 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-szk4c"] Oct 01 10:29:41 crc kubenswrapper[4735]: I1001 10:29:41.910585 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="273a8e2f-e052-41c0-a1d0-677b8e7dd9ef" path="/var/lib/kubelet/pods/273a8e2f-e052-41c0-a1d0-677b8e7dd9ef/volumes" Oct 01 10:29:43 crc kubenswrapper[4735]: I1001 10:29:43.790247 4735 generic.go:334] "Generic (PLEG): container finished" podID="379a0f6c-7c60-4328-9127-5cecc876bf76" containerID="348fef55d55218b0fa1350194df6f902900631e586d48c230dd57c6288c47550" exitCode=0 Oct 01 10:29:43 crc kubenswrapper[4735]: I1001 10:29:43.790345 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5g2md" event={"ID":"379a0f6c-7c60-4328-9127-5cecc876bf76","Type":"ContainerDied","Data":"348fef55d55218b0fa1350194df6f902900631e586d48c230dd57c6288c47550"} Oct 01 10:29:45 crc kubenswrapper[4735]: I1001 10:29:45.805556 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-5g2md" event={"ID":"379a0f6c-7c60-4328-9127-5cecc876bf76","Type":"ContainerStarted","Data":"5cd59d147c83a16c692a3709729090e852e5d71525e9fe29981dc9f59b3ef5b3"} Oct 01 10:29:45 crc kubenswrapper[4735]: I1001 10:29:45.825981 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5g2md" podStartSLOduration=3.943547049 podStartE2EDuration="6.825962236s" podCreationTimestamp="2025-10-01 10:29:39 +0000 UTC" firstStartedPulling="2025-10-01 10:29:41.767945936 +0000 UTC m=+740.460767238" lastFinishedPulling="2025-10-01 10:29:44.650361123 +0000 UTC m=+743.343182425" observedRunningTime="2025-10-01 10:29:45.823932202 +0000 UTC m=+744.516753464" watchObservedRunningTime="2025-10-01 10:29:45.825962236 +0000 UTC m=+744.518783498" Oct 01 10:29:49 crc kubenswrapper[4735]: I1001 10:29:49.926550 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-jmdr7" Oct 01 10:29:49 crc kubenswrapper[4735]: I1001 10:29:49.927215 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-jmdr7" Oct 01 10:29:49 crc kubenswrapper[4735]: I1001 10:29:49.968960 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-jmdr7" Oct 01 10:29:50 crc kubenswrapper[4735]: I1001 10:29:50.316294 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5g2md" Oct 01 10:29:50 crc kubenswrapper[4735]: I1001 10:29:50.316389 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5g2md" Oct 01 10:29:50 crc kubenswrapper[4735]: I1001 10:29:50.370405 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5g2md" Oct 01 
10:29:50 crc kubenswrapper[4735]: I1001 10:29:50.883347 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-jmdr7" Oct 01 10:29:50 crc kubenswrapper[4735]: I1001 10:29:50.902612 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5g2md" Oct 01 10:29:53 crc kubenswrapper[4735]: I1001 10:29:53.022971 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc"] Oct 01 10:29:53 crc kubenswrapper[4735]: E1001 10:29:53.023537 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="273a8e2f-e052-41c0-a1d0-677b8e7dd9ef" containerName="registry-server" Oct 01 10:29:53 crc kubenswrapper[4735]: I1001 10:29:53.023553 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="273a8e2f-e052-41c0-a1d0-677b8e7dd9ef" containerName="registry-server" Oct 01 10:29:53 crc kubenswrapper[4735]: I1001 10:29:53.023727 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="273a8e2f-e052-41c0-a1d0-677b8e7dd9ef" containerName="registry-server" Oct 01 10:29:53 crc kubenswrapper[4735]: I1001 10:29:53.024857 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc" Oct 01 10:29:53 crc kubenswrapper[4735]: I1001 10:29:53.028302 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-mbhnr" Oct 01 10:29:53 crc kubenswrapper[4735]: I1001 10:29:53.031421 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc"] Oct 01 10:29:53 crc kubenswrapper[4735]: I1001 10:29:53.184605 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqlnj\" (UniqueName: \"kubernetes.io/projected/820a7ecc-7cdd-4ef7-8b74-64419cb96b2d-kube-api-access-xqlnj\") pod \"7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc\" (UID: \"820a7ecc-7cdd-4ef7-8b74-64419cb96b2d\") " pod="openstack-operators/7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc" Oct 01 10:29:53 crc kubenswrapper[4735]: I1001 10:29:53.184655 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/820a7ecc-7cdd-4ef7-8b74-64419cb96b2d-util\") pod \"7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc\" (UID: \"820a7ecc-7cdd-4ef7-8b74-64419cb96b2d\") " pod="openstack-operators/7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc" Oct 01 10:29:53 crc kubenswrapper[4735]: I1001 10:29:53.184689 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/820a7ecc-7cdd-4ef7-8b74-64419cb96b2d-bundle\") pod \"7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc\" (UID: \"820a7ecc-7cdd-4ef7-8b74-64419cb96b2d\") " pod="openstack-operators/7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc" Oct 01 10:29:53 crc kubenswrapper[4735]: I1001 
10:29:53.286111 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqlnj\" (UniqueName: \"kubernetes.io/projected/820a7ecc-7cdd-4ef7-8b74-64419cb96b2d-kube-api-access-xqlnj\") pod \"7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc\" (UID: \"820a7ecc-7cdd-4ef7-8b74-64419cb96b2d\") " pod="openstack-operators/7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc" Oct 01 10:29:53 crc kubenswrapper[4735]: I1001 10:29:53.286161 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/820a7ecc-7cdd-4ef7-8b74-64419cb96b2d-util\") pod \"7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc\" (UID: \"820a7ecc-7cdd-4ef7-8b74-64419cb96b2d\") " pod="openstack-operators/7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc" Oct 01 10:29:53 crc kubenswrapper[4735]: I1001 10:29:53.286192 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/820a7ecc-7cdd-4ef7-8b74-64419cb96b2d-bundle\") pod \"7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc\" (UID: \"820a7ecc-7cdd-4ef7-8b74-64419cb96b2d\") " pod="openstack-operators/7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc" Oct 01 10:29:53 crc kubenswrapper[4735]: I1001 10:29:53.286822 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/820a7ecc-7cdd-4ef7-8b74-64419cb96b2d-bundle\") pod \"7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc\" (UID: \"820a7ecc-7cdd-4ef7-8b74-64419cb96b2d\") " pod="openstack-operators/7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc" Oct 01 10:29:53 crc kubenswrapper[4735]: I1001 10:29:53.287249 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/820a7ecc-7cdd-4ef7-8b74-64419cb96b2d-util\") pod \"7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc\" (UID: \"820a7ecc-7cdd-4ef7-8b74-64419cb96b2d\") " pod="openstack-operators/7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc" Oct 01 10:29:53 crc kubenswrapper[4735]: I1001 10:29:53.306426 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqlnj\" (UniqueName: \"kubernetes.io/projected/820a7ecc-7cdd-4ef7-8b74-64419cb96b2d-kube-api-access-xqlnj\") pod \"7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc\" (UID: \"820a7ecc-7cdd-4ef7-8b74-64419cb96b2d\") " pod="openstack-operators/7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc" Oct 01 10:29:53 crc kubenswrapper[4735]: I1001 10:29:53.348588 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc" Oct 01 10:29:53 crc kubenswrapper[4735]: I1001 10:29:53.772895 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc"] Oct 01 10:29:53 crc kubenswrapper[4735]: W1001 10:29:53.781064 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod820a7ecc_7cdd_4ef7_8b74_64419cb96b2d.slice/crio-a07d7e6f8f7e96f4e6f8856d57d073bb9624de6a9ba374d1248a1ff29fcb0520 WatchSource:0}: Error finding container a07d7e6f8f7e96f4e6f8856d57d073bb9624de6a9ba374d1248a1ff29fcb0520: Status 404 returned error can't find the container with id a07d7e6f8f7e96f4e6f8856d57d073bb9624de6a9ba374d1248a1ff29fcb0520 Oct 01 10:29:53 crc kubenswrapper[4735]: I1001 10:29:53.865090 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc" 
event={"ID":"820a7ecc-7cdd-4ef7-8b74-64419cb96b2d","Type":"ContainerStarted","Data":"a07d7e6f8f7e96f4e6f8856d57d073bb9624de6a9ba374d1248a1ff29fcb0520"} Oct 01 10:29:53 crc kubenswrapper[4735]: I1001 10:29:53.973391 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5g2md"] Oct 01 10:29:53 crc kubenswrapper[4735]: I1001 10:29:53.974101 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5g2md" podUID="379a0f6c-7c60-4328-9127-5cecc876bf76" containerName="registry-server" containerID="cri-o://5cd59d147c83a16c692a3709729090e852e5d71525e9fe29981dc9f59b3ef5b3" gracePeriod=2 Oct 01 10:29:54 crc kubenswrapper[4735]: I1001 10:29:54.879639 4735 generic.go:334] "Generic (PLEG): container finished" podID="379a0f6c-7c60-4328-9127-5cecc876bf76" containerID="5cd59d147c83a16c692a3709729090e852e5d71525e9fe29981dc9f59b3ef5b3" exitCode=0 Oct 01 10:29:54 crc kubenswrapper[4735]: I1001 10:29:54.879920 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5g2md" event={"ID":"379a0f6c-7c60-4328-9127-5cecc876bf76","Type":"ContainerDied","Data":"5cd59d147c83a16c692a3709729090e852e5d71525e9fe29981dc9f59b3ef5b3"} Oct 01 10:29:54 crc kubenswrapper[4735]: I1001 10:29:54.887807 4735 generic.go:334] "Generic (PLEG): container finished" podID="820a7ecc-7cdd-4ef7-8b74-64419cb96b2d" containerID="8512c58076eb594f896c77e5bd013e100c124f4968a8273c42cd74afd650be9f" exitCode=0 Oct 01 10:29:54 crc kubenswrapper[4735]: I1001 10:29:54.887876 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc" event={"ID":"820a7ecc-7cdd-4ef7-8b74-64419cb96b2d","Type":"ContainerDied","Data":"8512c58076eb594f896c77e5bd013e100c124f4968a8273c42cd74afd650be9f"} Oct 01 10:29:55 crc kubenswrapper[4735]: I1001 10:29:55.109843 4735 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/community-operators-5g2md" Oct 01 10:29:55 crc kubenswrapper[4735]: I1001 10:29:55.214645 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/379a0f6c-7c60-4328-9127-5cecc876bf76-utilities\") pod \"379a0f6c-7c60-4328-9127-5cecc876bf76\" (UID: \"379a0f6c-7c60-4328-9127-5cecc876bf76\") " Oct 01 10:29:55 crc kubenswrapper[4735]: I1001 10:29:55.214776 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-666dp\" (UniqueName: \"kubernetes.io/projected/379a0f6c-7c60-4328-9127-5cecc876bf76-kube-api-access-666dp\") pod \"379a0f6c-7c60-4328-9127-5cecc876bf76\" (UID: \"379a0f6c-7c60-4328-9127-5cecc876bf76\") " Oct 01 10:29:55 crc kubenswrapper[4735]: I1001 10:29:55.214854 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/379a0f6c-7c60-4328-9127-5cecc876bf76-catalog-content\") pod \"379a0f6c-7c60-4328-9127-5cecc876bf76\" (UID: \"379a0f6c-7c60-4328-9127-5cecc876bf76\") " Oct 01 10:29:55 crc kubenswrapper[4735]: I1001 10:29:55.216370 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/379a0f6c-7c60-4328-9127-5cecc876bf76-utilities" (OuterVolumeSpecName: "utilities") pod "379a0f6c-7c60-4328-9127-5cecc876bf76" (UID: "379a0f6c-7c60-4328-9127-5cecc876bf76"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:29:55 crc kubenswrapper[4735]: I1001 10:29:55.235085 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/379a0f6c-7c60-4328-9127-5cecc876bf76-kube-api-access-666dp" (OuterVolumeSpecName: "kube-api-access-666dp") pod "379a0f6c-7c60-4328-9127-5cecc876bf76" (UID: "379a0f6c-7c60-4328-9127-5cecc876bf76"). InnerVolumeSpecName "kube-api-access-666dp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:29:55 crc kubenswrapper[4735]: I1001 10:29:55.264412 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/379a0f6c-7c60-4328-9127-5cecc876bf76-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "379a0f6c-7c60-4328-9127-5cecc876bf76" (UID: "379a0f6c-7c60-4328-9127-5cecc876bf76"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:29:55 crc kubenswrapper[4735]: I1001 10:29:55.316156 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-666dp\" (UniqueName: \"kubernetes.io/projected/379a0f6c-7c60-4328-9127-5cecc876bf76-kube-api-access-666dp\") on node \"crc\" DevicePath \"\"" Oct 01 10:29:55 crc kubenswrapper[4735]: I1001 10:29:55.316190 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/379a0f6c-7c60-4328-9127-5cecc876bf76-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 10:29:55 crc kubenswrapper[4735]: I1001 10:29:55.316200 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/379a0f6c-7c60-4328-9127-5cecc876bf76-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 10:29:55 crc kubenswrapper[4735]: I1001 10:29:55.905210 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5g2md" Oct 01 10:29:55 crc kubenswrapper[4735]: I1001 10:29:55.907482 4735 generic.go:334] "Generic (PLEG): container finished" podID="820a7ecc-7cdd-4ef7-8b74-64419cb96b2d" containerID="8fe09f87962b5f48a1f63ad39c0a7ec257ff362ad79bc3f999a07d57dea15503" exitCode=0 Oct 01 10:29:55 crc kubenswrapper[4735]: I1001 10:29:55.908266 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5g2md" event={"ID":"379a0f6c-7c60-4328-9127-5cecc876bf76","Type":"ContainerDied","Data":"2e35d809aeaaab35c3635946e3dad7047cb90018061c212ac7ca5895105f1ef9"} Oct 01 10:29:55 crc kubenswrapper[4735]: I1001 10:29:55.908561 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc" event={"ID":"820a7ecc-7cdd-4ef7-8b74-64419cb96b2d","Type":"ContainerDied","Data":"8fe09f87962b5f48a1f63ad39c0a7ec257ff362ad79bc3f999a07d57dea15503"} Oct 01 10:29:55 crc kubenswrapper[4735]: I1001 10:29:55.908610 4735 scope.go:117] "RemoveContainer" containerID="5cd59d147c83a16c692a3709729090e852e5d71525e9fe29981dc9f59b3ef5b3" Oct 01 10:29:55 crc kubenswrapper[4735]: I1001 10:29:55.944354 4735 scope.go:117] "RemoveContainer" containerID="348fef55d55218b0fa1350194df6f902900631e586d48c230dd57c6288c47550" Oct 01 10:29:55 crc kubenswrapper[4735]: I1001 10:29:55.948144 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5g2md"] Oct 01 10:29:55 crc kubenswrapper[4735]: I1001 10:29:55.955168 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5g2md"] Oct 01 10:29:55 crc kubenswrapper[4735]: I1001 10:29:55.984472 4735 scope.go:117] "RemoveContainer" containerID="4d0e78459fb58210bddad9f420ff2ed79c86c920d5080e1fc095c63b24d46215" Oct 01 10:29:56 crc kubenswrapper[4735]: I1001 10:29:56.915727 4735 generic.go:334] "Generic (PLEG): 
container finished" podID="820a7ecc-7cdd-4ef7-8b74-64419cb96b2d" containerID="6de6739c27df263611b02a39f0ab8c535d2d652b2f97fc9a7ff071a739729ed4" exitCode=0 Oct 01 10:29:56 crc kubenswrapper[4735]: I1001 10:29:56.915813 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc" event={"ID":"820a7ecc-7cdd-4ef7-8b74-64419cb96b2d","Type":"ContainerDied","Data":"6de6739c27df263611b02a39f0ab8c535d2d652b2f97fc9a7ff071a739729ed4"} Oct 01 10:29:57 crc kubenswrapper[4735]: I1001 10:29:57.185357 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4rv2l"] Oct 01 10:29:57 crc kubenswrapper[4735]: E1001 10:29:57.185926 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="379a0f6c-7c60-4328-9127-5cecc876bf76" containerName="registry-server" Oct 01 10:29:57 crc kubenswrapper[4735]: I1001 10:29:57.185982 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="379a0f6c-7c60-4328-9127-5cecc876bf76" containerName="registry-server" Oct 01 10:29:57 crc kubenswrapper[4735]: E1001 10:29:57.186026 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="379a0f6c-7c60-4328-9127-5cecc876bf76" containerName="extract-content" Oct 01 10:29:57 crc kubenswrapper[4735]: I1001 10:29:57.186046 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="379a0f6c-7c60-4328-9127-5cecc876bf76" containerName="extract-content" Oct 01 10:29:57 crc kubenswrapper[4735]: E1001 10:29:57.186077 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="379a0f6c-7c60-4328-9127-5cecc876bf76" containerName="extract-utilities" Oct 01 10:29:57 crc kubenswrapper[4735]: I1001 10:29:57.186095 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="379a0f6c-7c60-4328-9127-5cecc876bf76" containerName="extract-utilities" Oct 01 10:29:57 crc kubenswrapper[4735]: I1001 10:29:57.186357 4735 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="379a0f6c-7c60-4328-9127-5cecc876bf76" containerName="registry-server" Oct 01 10:29:57 crc kubenswrapper[4735]: I1001 10:29:57.188358 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4rv2l" Oct 01 10:29:57 crc kubenswrapper[4735]: I1001 10:29:57.199731 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4rv2l"] Oct 01 10:29:57 crc kubenswrapper[4735]: I1001 10:29:57.343379 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsfft\" (UniqueName: \"kubernetes.io/projected/51fd9d7c-246d-4279-a6ea-82bc5e697058-kube-api-access-qsfft\") pod \"certified-operators-4rv2l\" (UID: \"51fd9d7c-246d-4279-a6ea-82bc5e697058\") " pod="openshift-marketplace/certified-operators-4rv2l" Oct 01 10:29:57 crc kubenswrapper[4735]: I1001 10:29:57.343473 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51fd9d7c-246d-4279-a6ea-82bc5e697058-catalog-content\") pod \"certified-operators-4rv2l\" (UID: \"51fd9d7c-246d-4279-a6ea-82bc5e697058\") " pod="openshift-marketplace/certified-operators-4rv2l" Oct 01 10:29:57 crc kubenswrapper[4735]: I1001 10:29:57.343515 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51fd9d7c-246d-4279-a6ea-82bc5e697058-utilities\") pod \"certified-operators-4rv2l\" (UID: \"51fd9d7c-246d-4279-a6ea-82bc5e697058\") " pod="openshift-marketplace/certified-operators-4rv2l" Oct 01 10:29:57 crc kubenswrapper[4735]: I1001 10:29:57.444878 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsfft\" (UniqueName: \"kubernetes.io/projected/51fd9d7c-246d-4279-a6ea-82bc5e697058-kube-api-access-qsfft\") pod 
\"certified-operators-4rv2l\" (UID: \"51fd9d7c-246d-4279-a6ea-82bc5e697058\") " pod="openshift-marketplace/certified-operators-4rv2l" Oct 01 10:29:57 crc kubenswrapper[4735]: I1001 10:29:57.444961 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51fd9d7c-246d-4279-a6ea-82bc5e697058-catalog-content\") pod \"certified-operators-4rv2l\" (UID: \"51fd9d7c-246d-4279-a6ea-82bc5e697058\") " pod="openshift-marketplace/certified-operators-4rv2l" Oct 01 10:29:57 crc kubenswrapper[4735]: I1001 10:29:57.444978 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51fd9d7c-246d-4279-a6ea-82bc5e697058-utilities\") pod \"certified-operators-4rv2l\" (UID: \"51fd9d7c-246d-4279-a6ea-82bc5e697058\") " pod="openshift-marketplace/certified-operators-4rv2l" Oct 01 10:29:57 crc kubenswrapper[4735]: I1001 10:29:57.445383 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51fd9d7c-246d-4279-a6ea-82bc5e697058-catalog-content\") pod \"certified-operators-4rv2l\" (UID: \"51fd9d7c-246d-4279-a6ea-82bc5e697058\") " pod="openshift-marketplace/certified-operators-4rv2l" Oct 01 10:29:57 crc kubenswrapper[4735]: I1001 10:29:57.445412 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51fd9d7c-246d-4279-a6ea-82bc5e697058-utilities\") pod \"certified-operators-4rv2l\" (UID: \"51fd9d7c-246d-4279-a6ea-82bc5e697058\") " pod="openshift-marketplace/certified-operators-4rv2l" Oct 01 10:29:57 crc kubenswrapper[4735]: I1001 10:29:57.461806 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsfft\" (UniqueName: \"kubernetes.io/projected/51fd9d7c-246d-4279-a6ea-82bc5e697058-kube-api-access-qsfft\") pod \"certified-operators-4rv2l\" (UID: 
\"51fd9d7c-246d-4279-a6ea-82bc5e697058\") " pod="openshift-marketplace/certified-operators-4rv2l" Oct 01 10:29:57 crc kubenswrapper[4735]: I1001 10:29:57.506023 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4rv2l" Oct 01 10:29:57 crc kubenswrapper[4735]: I1001 10:29:57.905559 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="379a0f6c-7c60-4328-9127-5cecc876bf76" path="/var/lib/kubelet/pods/379a0f6c-7c60-4328-9127-5cecc876bf76/volumes" Oct 01 10:29:57 crc kubenswrapper[4735]: I1001 10:29:57.981077 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4rv2l"] Oct 01 10:29:58 crc kubenswrapper[4735]: I1001 10:29:58.240974 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc" Oct 01 10:29:58 crc kubenswrapper[4735]: I1001 10:29:58.357696 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqlnj\" (UniqueName: \"kubernetes.io/projected/820a7ecc-7cdd-4ef7-8b74-64419cb96b2d-kube-api-access-xqlnj\") pod \"820a7ecc-7cdd-4ef7-8b74-64419cb96b2d\" (UID: \"820a7ecc-7cdd-4ef7-8b74-64419cb96b2d\") " Oct 01 10:29:58 crc kubenswrapper[4735]: I1001 10:29:58.357807 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/820a7ecc-7cdd-4ef7-8b74-64419cb96b2d-bundle\") pod \"820a7ecc-7cdd-4ef7-8b74-64419cb96b2d\" (UID: \"820a7ecc-7cdd-4ef7-8b74-64419cb96b2d\") " Oct 01 10:29:58 crc kubenswrapper[4735]: I1001 10:29:58.357861 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/820a7ecc-7cdd-4ef7-8b74-64419cb96b2d-util\") pod \"820a7ecc-7cdd-4ef7-8b74-64419cb96b2d\" (UID: \"820a7ecc-7cdd-4ef7-8b74-64419cb96b2d\") " Oct 01 
10:29:58 crc kubenswrapper[4735]: I1001 10:29:58.358824 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/820a7ecc-7cdd-4ef7-8b74-64419cb96b2d-bundle" (OuterVolumeSpecName: "bundle") pod "820a7ecc-7cdd-4ef7-8b74-64419cb96b2d" (UID: "820a7ecc-7cdd-4ef7-8b74-64419cb96b2d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:29:58 crc kubenswrapper[4735]: I1001 10:29:58.362174 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/820a7ecc-7cdd-4ef7-8b74-64419cb96b2d-kube-api-access-xqlnj" (OuterVolumeSpecName: "kube-api-access-xqlnj") pod "820a7ecc-7cdd-4ef7-8b74-64419cb96b2d" (UID: "820a7ecc-7cdd-4ef7-8b74-64419cb96b2d"). InnerVolumeSpecName "kube-api-access-xqlnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:29:58 crc kubenswrapper[4735]: I1001 10:29:58.371615 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/820a7ecc-7cdd-4ef7-8b74-64419cb96b2d-util" (OuterVolumeSpecName: "util") pod "820a7ecc-7cdd-4ef7-8b74-64419cb96b2d" (UID: "820a7ecc-7cdd-4ef7-8b74-64419cb96b2d"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:29:58 crc kubenswrapper[4735]: I1001 10:29:58.459361 4735 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/820a7ecc-7cdd-4ef7-8b74-64419cb96b2d-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:29:58 crc kubenswrapper[4735]: I1001 10:29:58.459402 4735 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/820a7ecc-7cdd-4ef7-8b74-64419cb96b2d-util\") on node \"crc\" DevicePath \"\"" Oct 01 10:29:58 crc kubenswrapper[4735]: I1001 10:29:58.459415 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqlnj\" (UniqueName: \"kubernetes.io/projected/820a7ecc-7cdd-4ef7-8b74-64419cb96b2d-kube-api-access-xqlnj\") on node \"crc\" DevicePath \"\"" Oct 01 10:29:58 crc kubenswrapper[4735]: I1001 10:29:58.932072 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc" Oct 01 10:29:58 crc kubenswrapper[4735]: I1001 10:29:58.932046 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc" event={"ID":"820a7ecc-7cdd-4ef7-8b74-64419cb96b2d","Type":"ContainerDied","Data":"a07d7e6f8f7e96f4e6f8856d57d073bb9624de6a9ba374d1248a1ff29fcb0520"} Oct 01 10:29:58 crc kubenswrapper[4735]: I1001 10:29:58.932136 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a07d7e6f8f7e96f4e6f8856d57d073bb9624de6a9ba374d1248a1ff29fcb0520" Oct 01 10:29:58 crc kubenswrapper[4735]: I1001 10:29:58.934198 4735 generic.go:334] "Generic (PLEG): container finished" podID="51fd9d7c-246d-4279-a6ea-82bc5e697058" containerID="b3d638c133c224db8df57588c6eaac7106874f75eada69d43f09be63891627f2" exitCode=0 Oct 01 10:29:58 crc kubenswrapper[4735]: I1001 10:29:58.934268 4735 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rv2l" event={"ID":"51fd9d7c-246d-4279-a6ea-82bc5e697058","Type":"ContainerDied","Data":"b3d638c133c224db8df57588c6eaac7106874f75eada69d43f09be63891627f2"} Oct 01 10:29:58 crc kubenswrapper[4735]: I1001 10:29:58.934312 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rv2l" event={"ID":"51fd9d7c-246d-4279-a6ea-82bc5e697058","Type":"ContainerStarted","Data":"228469ecb4653ce951daf6858ca6f1b9a611eb5f3a67e426a82257573f5d373a"} Oct 01 10:29:59 crc kubenswrapper[4735]: I1001 10:29:59.944280 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rv2l" event={"ID":"51fd9d7c-246d-4279-a6ea-82bc5e697058","Type":"ContainerStarted","Data":"9ecc5eed5aa7805a2818392d5baded129936a3562841087f98a6179dbf99eb96"} Oct 01 10:30:00 crc kubenswrapper[4735]: I1001 10:30:00.134004 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321910-74whj"] Oct 01 10:30:00 crc kubenswrapper[4735]: E1001 10:30:00.134397 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="820a7ecc-7cdd-4ef7-8b74-64419cb96b2d" containerName="util" Oct 01 10:30:00 crc kubenswrapper[4735]: I1001 10:30:00.134421 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="820a7ecc-7cdd-4ef7-8b74-64419cb96b2d" containerName="util" Oct 01 10:30:00 crc kubenswrapper[4735]: E1001 10:30:00.134436 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="820a7ecc-7cdd-4ef7-8b74-64419cb96b2d" containerName="pull" Oct 01 10:30:00 crc kubenswrapper[4735]: I1001 10:30:00.134445 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="820a7ecc-7cdd-4ef7-8b74-64419cb96b2d" containerName="pull" Oct 01 10:30:00 crc kubenswrapper[4735]: E1001 10:30:00.134463 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="820a7ecc-7cdd-4ef7-8b74-64419cb96b2d" 
containerName="extract" Oct 01 10:30:00 crc kubenswrapper[4735]: I1001 10:30:00.134472 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="820a7ecc-7cdd-4ef7-8b74-64419cb96b2d" containerName="extract" Oct 01 10:30:00 crc kubenswrapper[4735]: I1001 10:30:00.134640 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="820a7ecc-7cdd-4ef7-8b74-64419cb96b2d" containerName="extract" Oct 01 10:30:00 crc kubenswrapper[4735]: I1001 10:30:00.135153 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321910-74whj" Oct 01 10:30:00 crc kubenswrapper[4735]: I1001 10:30:00.140030 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 10:30:00 crc kubenswrapper[4735]: I1001 10:30:00.141095 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 10:30:00 crc kubenswrapper[4735]: I1001 10:30:00.144555 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321910-74whj"] Oct 01 10:30:00 crc kubenswrapper[4735]: I1001 10:30:00.285631 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgp8k\" (UniqueName: \"kubernetes.io/projected/692a51ac-dd61-47a9-ac24-64f95d1cb6d1-kube-api-access-fgp8k\") pod \"collect-profiles-29321910-74whj\" (UID: \"692a51ac-dd61-47a9-ac24-64f95d1cb6d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321910-74whj" Oct 01 10:30:00 crc kubenswrapper[4735]: I1001 10:30:00.285800 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/692a51ac-dd61-47a9-ac24-64f95d1cb6d1-secret-volume\") pod \"collect-profiles-29321910-74whj\" (UID: 
\"692a51ac-dd61-47a9-ac24-64f95d1cb6d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321910-74whj" Oct 01 10:30:00 crc kubenswrapper[4735]: I1001 10:30:00.285912 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/692a51ac-dd61-47a9-ac24-64f95d1cb6d1-config-volume\") pod \"collect-profiles-29321910-74whj\" (UID: \"692a51ac-dd61-47a9-ac24-64f95d1cb6d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321910-74whj" Oct 01 10:30:00 crc kubenswrapper[4735]: I1001 10:30:00.387973 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/692a51ac-dd61-47a9-ac24-64f95d1cb6d1-config-volume\") pod \"collect-profiles-29321910-74whj\" (UID: \"692a51ac-dd61-47a9-ac24-64f95d1cb6d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321910-74whj" Oct 01 10:30:00 crc kubenswrapper[4735]: I1001 10:30:00.388083 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgp8k\" (UniqueName: \"kubernetes.io/projected/692a51ac-dd61-47a9-ac24-64f95d1cb6d1-kube-api-access-fgp8k\") pod \"collect-profiles-29321910-74whj\" (UID: \"692a51ac-dd61-47a9-ac24-64f95d1cb6d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321910-74whj" Oct 01 10:30:00 crc kubenswrapper[4735]: I1001 10:30:00.388184 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/692a51ac-dd61-47a9-ac24-64f95d1cb6d1-secret-volume\") pod \"collect-profiles-29321910-74whj\" (UID: \"692a51ac-dd61-47a9-ac24-64f95d1cb6d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321910-74whj" Oct 01 10:30:00 crc kubenswrapper[4735]: I1001 10:30:00.389310 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/692a51ac-dd61-47a9-ac24-64f95d1cb6d1-config-volume\") pod \"collect-profiles-29321910-74whj\" (UID: \"692a51ac-dd61-47a9-ac24-64f95d1cb6d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321910-74whj" Oct 01 10:30:00 crc kubenswrapper[4735]: I1001 10:30:00.396985 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/692a51ac-dd61-47a9-ac24-64f95d1cb6d1-secret-volume\") pod \"collect-profiles-29321910-74whj\" (UID: \"692a51ac-dd61-47a9-ac24-64f95d1cb6d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321910-74whj" Oct 01 10:30:00 crc kubenswrapper[4735]: I1001 10:30:00.407305 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgp8k\" (UniqueName: \"kubernetes.io/projected/692a51ac-dd61-47a9-ac24-64f95d1cb6d1-kube-api-access-fgp8k\") pod \"collect-profiles-29321910-74whj\" (UID: \"692a51ac-dd61-47a9-ac24-64f95d1cb6d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321910-74whj" Oct 01 10:30:00 crc kubenswrapper[4735]: I1001 10:30:00.452760 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321910-74whj" Oct 01 10:30:00 crc kubenswrapper[4735]: I1001 10:30:00.781259 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6qncp"] Oct 01 10:30:00 crc kubenswrapper[4735]: I1001 10:30:00.783407 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6qncp" Oct 01 10:30:00 crc kubenswrapper[4735]: I1001 10:30:00.791838 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6qncp"] Oct 01 10:30:00 crc kubenswrapper[4735]: I1001 10:30:00.896204 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm5lc\" (UniqueName: \"kubernetes.io/projected/b6a1b789-fd48-4452-be68-f5d108f5f208-kube-api-access-sm5lc\") pod \"redhat-operators-6qncp\" (UID: \"b6a1b789-fd48-4452-be68-f5d108f5f208\") " pod="openshift-marketplace/redhat-operators-6qncp" Oct 01 10:30:00 crc kubenswrapper[4735]: I1001 10:30:00.896261 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6a1b789-fd48-4452-be68-f5d108f5f208-catalog-content\") pod \"redhat-operators-6qncp\" (UID: \"b6a1b789-fd48-4452-be68-f5d108f5f208\") " pod="openshift-marketplace/redhat-operators-6qncp" Oct 01 10:30:00 crc kubenswrapper[4735]: I1001 10:30:00.896320 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6a1b789-fd48-4452-be68-f5d108f5f208-utilities\") pod \"redhat-operators-6qncp\" (UID: \"b6a1b789-fd48-4452-be68-f5d108f5f208\") " pod="openshift-marketplace/redhat-operators-6qncp" Oct 01 10:30:00 crc kubenswrapper[4735]: I1001 10:30:00.912846 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321910-74whj"] Oct 01 10:30:00 crc kubenswrapper[4735]: W1001 10:30:00.928231 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod692a51ac_dd61_47a9_ac24_64f95d1cb6d1.slice/crio-1a3728fce1f140be0fcb74c6a97eec9fa375efb5b374d1da75ff880eb2433526 WatchSource:0}: 
Error finding container 1a3728fce1f140be0fcb74c6a97eec9fa375efb5b374d1da75ff880eb2433526: Status 404 returned error can't find the container with id 1a3728fce1f140be0fcb74c6a97eec9fa375efb5b374d1da75ff880eb2433526 Oct 01 10:30:00 crc kubenswrapper[4735]: I1001 10:30:00.963932 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321910-74whj" event={"ID":"692a51ac-dd61-47a9-ac24-64f95d1cb6d1","Type":"ContainerStarted","Data":"1a3728fce1f140be0fcb74c6a97eec9fa375efb5b374d1da75ff880eb2433526"} Oct 01 10:30:00 crc kubenswrapper[4735]: I1001 10:30:00.977879 4735 generic.go:334] "Generic (PLEG): container finished" podID="51fd9d7c-246d-4279-a6ea-82bc5e697058" containerID="9ecc5eed5aa7805a2818392d5baded129936a3562841087f98a6179dbf99eb96" exitCode=0 Oct 01 10:30:00 crc kubenswrapper[4735]: I1001 10:30:00.977958 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rv2l" event={"ID":"51fd9d7c-246d-4279-a6ea-82bc5e697058","Type":"ContainerDied","Data":"9ecc5eed5aa7805a2818392d5baded129936a3562841087f98a6179dbf99eb96"} Oct 01 10:30:00 crc kubenswrapper[4735]: I1001 10:30:00.998474 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm5lc\" (UniqueName: \"kubernetes.io/projected/b6a1b789-fd48-4452-be68-f5d108f5f208-kube-api-access-sm5lc\") pod \"redhat-operators-6qncp\" (UID: \"b6a1b789-fd48-4452-be68-f5d108f5f208\") " pod="openshift-marketplace/redhat-operators-6qncp" Oct 01 10:30:00 crc kubenswrapper[4735]: I1001 10:30:00.998639 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6a1b789-fd48-4452-be68-f5d108f5f208-catalog-content\") pod \"redhat-operators-6qncp\" (UID: \"b6a1b789-fd48-4452-be68-f5d108f5f208\") " pod="openshift-marketplace/redhat-operators-6qncp" Oct 01 10:30:00 crc kubenswrapper[4735]: I1001 10:30:00.998783 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6a1b789-fd48-4452-be68-f5d108f5f208-utilities\") pod \"redhat-operators-6qncp\" (UID: \"b6a1b789-fd48-4452-be68-f5d108f5f208\") " pod="openshift-marketplace/redhat-operators-6qncp" Oct 01 10:30:00 crc kubenswrapper[4735]: I1001 10:30:00.999481 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6a1b789-fd48-4452-be68-f5d108f5f208-catalog-content\") pod \"redhat-operators-6qncp\" (UID: \"b6a1b789-fd48-4452-be68-f5d108f5f208\") " pod="openshift-marketplace/redhat-operators-6qncp" Oct 01 10:30:00 crc kubenswrapper[4735]: I1001 10:30:00.999615 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6a1b789-fd48-4452-be68-f5d108f5f208-utilities\") pod \"redhat-operators-6qncp\" (UID: \"b6a1b789-fd48-4452-be68-f5d108f5f208\") " pod="openshift-marketplace/redhat-operators-6qncp" Oct 01 10:30:01 crc kubenswrapper[4735]: I1001 10:30:01.026291 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm5lc\" (UniqueName: \"kubernetes.io/projected/b6a1b789-fd48-4452-be68-f5d108f5f208-kube-api-access-sm5lc\") pod \"redhat-operators-6qncp\" (UID: \"b6a1b789-fd48-4452-be68-f5d108f5f208\") " pod="openshift-marketplace/redhat-operators-6qncp" Oct 01 10:30:01 crc kubenswrapper[4735]: I1001 10:30:01.104053 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6qncp" Oct 01 10:30:01 crc kubenswrapper[4735]: I1001 10:30:01.518420 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6qncp"] Oct 01 10:30:01 crc kubenswrapper[4735]: I1001 10:30:01.984432 4735 generic.go:334] "Generic (PLEG): container finished" podID="b6a1b789-fd48-4452-be68-f5d108f5f208" containerID="8cdd3eee68cedd2ef188fb2a40ff8838351bef1873eff608d4ff340531d44683" exitCode=0 Oct 01 10:30:01 crc kubenswrapper[4735]: I1001 10:30:01.984508 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qncp" event={"ID":"b6a1b789-fd48-4452-be68-f5d108f5f208","Type":"ContainerDied","Data":"8cdd3eee68cedd2ef188fb2a40ff8838351bef1873eff608d4ff340531d44683"} Oct 01 10:30:01 crc kubenswrapper[4735]: I1001 10:30:01.984736 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qncp" event={"ID":"b6a1b789-fd48-4452-be68-f5d108f5f208","Type":"ContainerStarted","Data":"917bf69a6b91681315cc30b4e38103d4ae645e0aec6e8868fc87f035359ca384"} Oct 01 10:30:01 crc kubenswrapper[4735]: I1001 10:30:01.986663 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rv2l" event={"ID":"51fd9d7c-246d-4279-a6ea-82bc5e697058","Type":"ContainerStarted","Data":"386433bb6b314eabb9a5c9c7c257ac35ec82881748b82fda4937b6e5006c43d5"} Oct 01 10:30:01 crc kubenswrapper[4735]: I1001 10:30:01.988729 4735 generic.go:334] "Generic (PLEG): container finished" podID="692a51ac-dd61-47a9-ac24-64f95d1cb6d1" containerID="36450a9da66059fd1c7c7b8f6925bcae76f3eb7872468f1bbffe5c54dc67a179" exitCode=0 Oct 01 10:30:01 crc kubenswrapper[4735]: I1001 10:30:01.988779 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321910-74whj" 
event={"ID":"692a51ac-dd61-47a9-ac24-64f95d1cb6d1","Type":"ContainerDied","Data":"36450a9da66059fd1c7c7b8f6925bcae76f3eb7872468f1bbffe5c54dc67a179"} Oct 01 10:30:02 crc kubenswrapper[4735]: I1001 10:30:02.017758 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4rv2l" podStartSLOduration=3.040065448 podStartE2EDuration="5.017737157s" podCreationTimestamp="2025-10-01 10:29:57 +0000 UTC" firstStartedPulling="2025-10-01 10:29:58.936174272 +0000 UTC m=+757.628995574" lastFinishedPulling="2025-10-01 10:30:00.913846011 +0000 UTC m=+759.606667283" observedRunningTime="2025-10-01 10:30:02.015710002 +0000 UTC m=+760.708531264" watchObservedRunningTime="2025-10-01 10:30:02.017737157 +0000 UTC m=+760.710558419" Oct 01 10:30:03 crc kubenswrapper[4735]: I1001 10:30:03.298795 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321910-74whj" Oct 01 10:30:03 crc kubenswrapper[4735]: I1001 10:30:03.434245 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/692a51ac-dd61-47a9-ac24-64f95d1cb6d1-secret-volume\") pod \"692a51ac-dd61-47a9-ac24-64f95d1cb6d1\" (UID: \"692a51ac-dd61-47a9-ac24-64f95d1cb6d1\") " Oct 01 10:30:03 crc kubenswrapper[4735]: I1001 10:30:03.434374 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/692a51ac-dd61-47a9-ac24-64f95d1cb6d1-config-volume\") pod \"692a51ac-dd61-47a9-ac24-64f95d1cb6d1\" (UID: \"692a51ac-dd61-47a9-ac24-64f95d1cb6d1\") " Oct 01 10:30:03 crc kubenswrapper[4735]: I1001 10:30:03.435156 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgp8k\" (UniqueName: \"kubernetes.io/projected/692a51ac-dd61-47a9-ac24-64f95d1cb6d1-kube-api-access-fgp8k\") pod 
\"692a51ac-dd61-47a9-ac24-64f95d1cb6d1\" (UID: \"692a51ac-dd61-47a9-ac24-64f95d1cb6d1\") " Oct 01 10:30:03 crc kubenswrapper[4735]: I1001 10:30:03.435161 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/692a51ac-dd61-47a9-ac24-64f95d1cb6d1-config-volume" (OuterVolumeSpecName: "config-volume") pod "692a51ac-dd61-47a9-ac24-64f95d1cb6d1" (UID: "692a51ac-dd61-47a9-ac24-64f95d1cb6d1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:30:03 crc kubenswrapper[4735]: I1001 10:30:03.435618 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/692a51ac-dd61-47a9-ac24-64f95d1cb6d1-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 10:30:03 crc kubenswrapper[4735]: I1001 10:30:03.439689 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/692a51ac-dd61-47a9-ac24-64f95d1cb6d1-kube-api-access-fgp8k" (OuterVolumeSpecName: "kube-api-access-fgp8k") pod "692a51ac-dd61-47a9-ac24-64f95d1cb6d1" (UID: "692a51ac-dd61-47a9-ac24-64f95d1cb6d1"). InnerVolumeSpecName "kube-api-access-fgp8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:30:03 crc kubenswrapper[4735]: I1001 10:30:03.439883 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/692a51ac-dd61-47a9-ac24-64f95d1cb6d1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "692a51ac-dd61-47a9-ac24-64f95d1cb6d1" (UID: "692a51ac-dd61-47a9-ac24-64f95d1cb6d1"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:30:03 crc kubenswrapper[4735]: I1001 10:30:03.536521 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgp8k\" (UniqueName: \"kubernetes.io/projected/692a51ac-dd61-47a9-ac24-64f95d1cb6d1-kube-api-access-fgp8k\") on node \"crc\" DevicePath \"\"" Oct 01 10:30:03 crc kubenswrapper[4735]: I1001 10:30:03.536564 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/692a51ac-dd61-47a9-ac24-64f95d1cb6d1-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 10:30:04 crc kubenswrapper[4735]: I1001 10:30:04.004126 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321910-74whj" Oct 01 10:30:04 crc kubenswrapper[4735]: I1001 10:30:04.004144 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321910-74whj" event={"ID":"692a51ac-dd61-47a9-ac24-64f95d1cb6d1","Type":"ContainerDied","Data":"1a3728fce1f140be0fcb74c6a97eec9fa375efb5b374d1da75ff880eb2433526"} Oct 01 10:30:04 crc kubenswrapper[4735]: I1001 10:30:04.004639 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a3728fce1f140be0fcb74c6a97eec9fa375efb5b374d1da75ff880eb2433526" Oct 01 10:30:04 crc kubenswrapper[4735]: I1001 10:30:04.006440 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qncp" event={"ID":"b6a1b789-fd48-4452-be68-f5d108f5f208","Type":"ContainerStarted","Data":"34735302f05e365bb11b738fffd0b3cd432997f8d52acd4e9a339217bbe3cdb6"} Oct 01 10:30:04 crc kubenswrapper[4735]: I1001 10:30:04.385048 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-794c859bbc-8kzp5"] Oct 01 10:30:04 crc kubenswrapper[4735]: E1001 10:30:04.385379 4735 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="692a51ac-dd61-47a9-ac24-64f95d1cb6d1" containerName="collect-profiles" Oct 01 10:30:04 crc kubenswrapper[4735]: I1001 10:30:04.385396 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="692a51ac-dd61-47a9-ac24-64f95d1cb6d1" containerName="collect-profiles" Oct 01 10:30:04 crc kubenswrapper[4735]: I1001 10:30:04.385588 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="692a51ac-dd61-47a9-ac24-64f95d1cb6d1" containerName="collect-profiles" Oct 01 10:30:04 crc kubenswrapper[4735]: I1001 10:30:04.386292 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-794c859bbc-8kzp5" Oct 01 10:30:04 crc kubenswrapper[4735]: I1001 10:30:04.388812 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-6nmd8" Oct 01 10:30:04 crc kubenswrapper[4735]: I1001 10:30:04.408555 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-794c859bbc-8kzp5"] Oct 01 10:30:04 crc kubenswrapper[4735]: I1001 10:30:04.549354 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjpvp\" (UniqueName: \"kubernetes.io/projected/12603a12-a744-42f1-b0fd-e1a88e490d81-kube-api-access-pjpvp\") pod \"openstack-operator-controller-operator-794c859bbc-8kzp5\" (UID: \"12603a12-a744-42f1-b0fd-e1a88e490d81\") " pod="openstack-operators/openstack-operator-controller-operator-794c859bbc-8kzp5" Oct 01 10:30:04 crc kubenswrapper[4735]: I1001 10:30:04.651028 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjpvp\" (UniqueName: \"kubernetes.io/projected/12603a12-a744-42f1-b0fd-e1a88e490d81-kube-api-access-pjpvp\") pod \"openstack-operator-controller-operator-794c859bbc-8kzp5\" (UID: 
\"12603a12-a744-42f1-b0fd-e1a88e490d81\") " pod="openstack-operators/openstack-operator-controller-operator-794c859bbc-8kzp5" Oct 01 10:30:04 crc kubenswrapper[4735]: I1001 10:30:04.668956 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjpvp\" (UniqueName: \"kubernetes.io/projected/12603a12-a744-42f1-b0fd-e1a88e490d81-kube-api-access-pjpvp\") pod \"openstack-operator-controller-operator-794c859bbc-8kzp5\" (UID: \"12603a12-a744-42f1-b0fd-e1a88e490d81\") " pod="openstack-operators/openstack-operator-controller-operator-794c859bbc-8kzp5" Oct 01 10:30:04 crc kubenswrapper[4735]: I1001 10:30:04.700670 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-794c859bbc-8kzp5" Oct 01 10:30:05 crc kubenswrapper[4735]: I1001 10:30:05.015639 4735 generic.go:334] "Generic (PLEG): container finished" podID="b6a1b789-fd48-4452-be68-f5d108f5f208" containerID="34735302f05e365bb11b738fffd0b3cd432997f8d52acd4e9a339217bbe3cdb6" exitCode=0 Oct 01 10:30:05 crc kubenswrapper[4735]: I1001 10:30:05.015681 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qncp" event={"ID":"b6a1b789-fd48-4452-be68-f5d108f5f208","Type":"ContainerDied","Data":"34735302f05e365bb11b738fffd0b3cd432997f8d52acd4e9a339217bbe3cdb6"} Oct 01 10:30:05 crc kubenswrapper[4735]: I1001 10:30:05.133848 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-794c859bbc-8kzp5"] Oct 01 10:30:05 crc kubenswrapper[4735]: W1001 10:30:05.139661 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12603a12_a744_42f1_b0fd_e1a88e490d81.slice/crio-c11d3aacc02017a1bbf9cb586663b1e73ad532f064e593fa83623bcdec16a2da WatchSource:0}: Error finding container c11d3aacc02017a1bbf9cb586663b1e73ad532f064e593fa83623bcdec16a2da: 
Status 404 returned error can't find the container with id c11d3aacc02017a1bbf9cb586663b1e73ad532f064e593fa83623bcdec16a2da Oct 01 10:30:05 crc kubenswrapper[4735]: I1001 10:30:05.485224 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 10:30:05 crc kubenswrapper[4735]: I1001 10:30:05.485276 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 10:30:05 crc kubenswrapper[4735]: I1001 10:30:05.580657 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-669s7"] Oct 01 10:30:05 crc kubenswrapper[4735]: I1001 10:30:05.581906 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-669s7" Oct 01 10:30:05 crc kubenswrapper[4735]: I1001 10:30:05.591158 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-669s7"] Oct 01 10:30:05 crc kubenswrapper[4735]: I1001 10:30:05.670765 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b-catalog-content\") pod \"redhat-marketplace-669s7\" (UID: \"5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b\") " pod="openshift-marketplace/redhat-marketplace-669s7" Oct 01 10:30:05 crc kubenswrapper[4735]: I1001 10:30:05.670838 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b-utilities\") pod \"redhat-marketplace-669s7\" (UID: \"5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b\") " pod="openshift-marketplace/redhat-marketplace-669s7" Oct 01 10:30:05 crc kubenswrapper[4735]: I1001 10:30:05.670860 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bt4j\" (UniqueName: \"kubernetes.io/projected/5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b-kube-api-access-9bt4j\") pod \"redhat-marketplace-669s7\" (UID: \"5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b\") " pod="openshift-marketplace/redhat-marketplace-669s7" Oct 01 10:30:05 crc kubenswrapper[4735]: I1001 10:30:05.776014 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b-catalog-content\") pod \"redhat-marketplace-669s7\" (UID: \"5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b\") " pod="openshift-marketplace/redhat-marketplace-669s7" Oct 01 10:30:05 crc kubenswrapper[4735]: I1001 10:30:05.776614 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b-utilities\") pod \"redhat-marketplace-669s7\" (UID: \"5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b\") " pod="openshift-marketplace/redhat-marketplace-669s7" Oct 01 10:30:05 crc kubenswrapper[4735]: I1001 10:30:05.776882 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b-catalog-content\") pod \"redhat-marketplace-669s7\" (UID: \"5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b\") " pod="openshift-marketplace/redhat-marketplace-669s7" Oct 01 10:30:05 crc kubenswrapper[4735]: I1001 10:30:05.776095 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b-utilities\") pod \"redhat-marketplace-669s7\" (UID: \"5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b\") " pod="openshift-marketplace/redhat-marketplace-669s7" Oct 01 10:30:05 crc kubenswrapper[4735]: I1001 10:30:05.777045 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bt4j\" (UniqueName: \"kubernetes.io/projected/5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b-kube-api-access-9bt4j\") pod \"redhat-marketplace-669s7\" (UID: \"5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b\") " pod="openshift-marketplace/redhat-marketplace-669s7" Oct 01 10:30:05 crc kubenswrapper[4735]: I1001 10:30:05.796278 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bt4j\" (UniqueName: \"kubernetes.io/projected/5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b-kube-api-access-9bt4j\") pod \"redhat-marketplace-669s7\" (UID: \"5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b\") " pod="openshift-marketplace/redhat-marketplace-669s7" Oct 01 10:30:05 crc kubenswrapper[4735]: I1001 10:30:05.900050 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-669s7" Oct 01 10:30:06 crc kubenswrapper[4735]: I1001 10:30:06.025657 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qncp" event={"ID":"b6a1b789-fd48-4452-be68-f5d108f5f208","Type":"ContainerStarted","Data":"bb2057353a7443ec796661bc496f8610059971e630cbd659c97be4aada24fd35"} Oct 01 10:30:06 crc kubenswrapper[4735]: I1001 10:30:06.044456 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-794c859bbc-8kzp5" event={"ID":"12603a12-a744-42f1-b0fd-e1a88e490d81","Type":"ContainerStarted","Data":"c11d3aacc02017a1bbf9cb586663b1e73ad532f064e593fa83623bcdec16a2da"} Oct 01 10:30:06 crc kubenswrapper[4735]: I1001 10:30:06.057856 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6qncp" podStartSLOduration=2.559446436 podStartE2EDuration="6.057841539s" podCreationTimestamp="2025-10-01 10:30:00 +0000 UTC" firstStartedPulling="2025-10-01 10:30:01.98631645 +0000 UTC m=+760.679137712" lastFinishedPulling="2025-10-01 10:30:05.484711543 +0000 UTC m=+764.177532815" observedRunningTime="2025-10-01 10:30:06.055576889 +0000 UTC m=+764.748398151" watchObservedRunningTime="2025-10-01 10:30:06.057841539 +0000 UTC m=+764.750662801" Oct 01 10:30:06 crc kubenswrapper[4735]: I1001 10:30:06.355549 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-669s7"] Oct 01 10:30:07 crc kubenswrapper[4735]: I1001 10:30:07.506725 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4rv2l" Oct 01 10:30:07 crc kubenswrapper[4735]: I1001 10:30:07.507071 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4rv2l" Oct 01 10:30:07 crc kubenswrapper[4735]: I1001 10:30:07.560448 4735 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4rv2l" Oct 01 10:30:08 crc kubenswrapper[4735]: I1001 10:30:08.100280 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4rv2l" Oct 01 10:30:08 crc kubenswrapper[4735]: W1001 10:30:08.423967 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a0efed0_2b56_4ee2_ba0e_8a34a4a1e00b.slice/crio-a0325d26fad3f9e7c6dc407c61fd976a16a5d8fd713f90dce977e1762f1d373c WatchSource:0}: Error finding container a0325d26fad3f9e7c6dc407c61fd976a16a5d8fd713f90dce977e1762f1d373c: Status 404 returned error can't find the container with id a0325d26fad3f9e7c6dc407c61fd976a16a5d8fd713f90dce977e1762f1d373c Oct 01 10:30:09 crc kubenswrapper[4735]: I1001 10:30:09.064626 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-794c859bbc-8kzp5" event={"ID":"12603a12-a744-42f1-b0fd-e1a88e490d81","Type":"ContainerStarted","Data":"93641b6312f28407a62caeab7462f05c834875b64cc58ac6173937bd658bb5f0"} Oct 01 10:30:09 crc kubenswrapper[4735]: I1001 10:30:09.065970 4735 generic.go:334] "Generic (PLEG): container finished" podID="5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b" containerID="722da8612fbf4d5f8c486295d8c609e7d42094f2daa9b6127f57ee6f6a38b58b" exitCode=0 Oct 01 10:30:09 crc kubenswrapper[4735]: I1001 10:30:09.066065 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-669s7" event={"ID":"5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b","Type":"ContainerDied","Data":"722da8612fbf4d5f8c486295d8c609e7d42094f2daa9b6127f57ee6f6a38b58b"} Oct 01 10:30:09 crc kubenswrapper[4735]: I1001 10:30:09.066092 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-669s7" 
event={"ID":"5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b","Type":"ContainerStarted","Data":"a0325d26fad3f9e7c6dc407c61fd976a16a5d8fd713f90dce977e1762f1d373c"} Oct 01 10:30:09 crc kubenswrapper[4735]: I1001 10:30:09.370140 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4rv2l"] Oct 01 10:30:10 crc kubenswrapper[4735]: I1001 10:30:10.072519 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4rv2l" podUID="51fd9d7c-246d-4279-a6ea-82bc5e697058" containerName="registry-server" containerID="cri-o://386433bb6b314eabb9a5c9c7c257ac35ec82881748b82fda4937b6e5006c43d5" gracePeriod=2 Oct 01 10:30:11 crc kubenswrapper[4735]: I1001 10:30:11.079206 4735 generic.go:334] "Generic (PLEG): container finished" podID="51fd9d7c-246d-4279-a6ea-82bc5e697058" containerID="386433bb6b314eabb9a5c9c7c257ac35ec82881748b82fda4937b6e5006c43d5" exitCode=0 Oct 01 10:30:11 crc kubenswrapper[4735]: I1001 10:30:11.079240 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rv2l" event={"ID":"51fd9d7c-246d-4279-a6ea-82bc5e697058","Type":"ContainerDied","Data":"386433bb6b314eabb9a5c9c7c257ac35ec82881748b82fda4937b6e5006c43d5"} Oct 01 10:30:11 crc kubenswrapper[4735]: I1001 10:30:11.104620 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6qncp" Oct 01 10:30:11 crc kubenswrapper[4735]: I1001 10:30:11.104692 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6qncp" Oct 01 10:30:11 crc kubenswrapper[4735]: I1001 10:30:11.144713 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6qncp" Oct 01 10:30:11 crc kubenswrapper[4735]: I1001 10:30:11.970816 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4rv2l" Oct 01 10:30:12 crc kubenswrapper[4735]: I1001 10:30:12.064423 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51fd9d7c-246d-4279-a6ea-82bc5e697058-catalog-content\") pod \"51fd9d7c-246d-4279-a6ea-82bc5e697058\" (UID: \"51fd9d7c-246d-4279-a6ea-82bc5e697058\") " Oct 01 10:30:12 crc kubenswrapper[4735]: I1001 10:30:12.064549 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51fd9d7c-246d-4279-a6ea-82bc5e697058-utilities\") pod \"51fd9d7c-246d-4279-a6ea-82bc5e697058\" (UID: \"51fd9d7c-246d-4279-a6ea-82bc5e697058\") " Oct 01 10:30:12 crc kubenswrapper[4735]: I1001 10:30:12.064586 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsfft\" (UniqueName: \"kubernetes.io/projected/51fd9d7c-246d-4279-a6ea-82bc5e697058-kube-api-access-qsfft\") pod \"51fd9d7c-246d-4279-a6ea-82bc5e697058\" (UID: \"51fd9d7c-246d-4279-a6ea-82bc5e697058\") " Oct 01 10:30:12 crc kubenswrapper[4735]: I1001 10:30:12.065651 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51fd9d7c-246d-4279-a6ea-82bc5e697058-utilities" (OuterVolumeSpecName: "utilities") pod "51fd9d7c-246d-4279-a6ea-82bc5e697058" (UID: "51fd9d7c-246d-4279-a6ea-82bc5e697058"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:30:12 crc kubenswrapper[4735]: I1001 10:30:12.073058 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51fd9d7c-246d-4279-a6ea-82bc5e697058-kube-api-access-qsfft" (OuterVolumeSpecName: "kube-api-access-qsfft") pod "51fd9d7c-246d-4279-a6ea-82bc5e697058" (UID: "51fd9d7c-246d-4279-a6ea-82bc5e697058"). InnerVolumeSpecName "kube-api-access-qsfft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:30:12 crc kubenswrapper[4735]: I1001 10:30:12.091953 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4rv2l" Oct 01 10:30:12 crc kubenswrapper[4735]: I1001 10:30:12.092244 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rv2l" event={"ID":"51fd9d7c-246d-4279-a6ea-82bc5e697058","Type":"ContainerDied","Data":"228469ecb4653ce951daf6858ca6f1b9a611eb5f3a67e426a82257573f5d373a"} Oct 01 10:30:12 crc kubenswrapper[4735]: I1001 10:30:12.092293 4735 scope.go:117] "RemoveContainer" containerID="386433bb6b314eabb9a5c9c7c257ac35ec82881748b82fda4937b6e5006c43d5" Oct 01 10:30:12 crc kubenswrapper[4735]: I1001 10:30:12.125411 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51fd9d7c-246d-4279-a6ea-82bc5e697058-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "51fd9d7c-246d-4279-a6ea-82bc5e697058" (UID: "51fd9d7c-246d-4279-a6ea-82bc5e697058"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:30:12 crc kubenswrapper[4735]: I1001 10:30:12.138164 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6qncp" Oct 01 10:30:12 crc kubenswrapper[4735]: I1001 10:30:12.142443 4735 scope.go:117] "RemoveContainer" containerID="9ecc5eed5aa7805a2818392d5baded129936a3562841087f98a6179dbf99eb96" Oct 01 10:30:12 crc kubenswrapper[4735]: I1001 10:30:12.166320 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51fd9d7c-246d-4279-a6ea-82bc5e697058-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 10:30:12 crc kubenswrapper[4735]: I1001 10:30:12.166368 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51fd9d7c-246d-4279-a6ea-82bc5e697058-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 10:30:12 crc kubenswrapper[4735]: I1001 10:30:12.166384 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsfft\" (UniqueName: \"kubernetes.io/projected/51fd9d7c-246d-4279-a6ea-82bc5e697058-kube-api-access-qsfft\") on node \"crc\" DevicePath \"\"" Oct 01 10:30:12 crc kubenswrapper[4735]: I1001 10:30:12.305507 4735 scope.go:117] "RemoveContainer" containerID="b3d638c133c224db8df57588c6eaac7106874f75eada69d43f09be63891627f2" Oct 01 10:30:12 crc kubenswrapper[4735]: I1001 10:30:12.430542 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4rv2l"] Oct 01 10:30:12 crc kubenswrapper[4735]: I1001 10:30:12.439388 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4rv2l"] Oct 01 10:30:12 crc kubenswrapper[4735]: I1001 10:30:12.771257 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6qncp"] Oct 01 10:30:13 crc kubenswrapper[4735]: I1001 10:30:13.102716 4735 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-794c859bbc-8kzp5" event={"ID":"12603a12-a744-42f1-b0fd-e1a88e490d81","Type":"ContainerStarted","Data":"e0e357235a01fbf152f5a2230692692d150174d42be90c27be3143945ec8a746"} Oct 01 10:30:13 crc kubenswrapper[4735]: I1001 10:30:13.103069 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-794c859bbc-8kzp5" Oct 01 10:30:13 crc kubenswrapper[4735]: I1001 10:30:13.105778 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-794c859bbc-8kzp5" Oct 01 10:30:13 crc kubenswrapper[4735]: I1001 10:30:13.105949 4735 generic.go:334] "Generic (PLEG): container finished" podID="5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b" containerID="d941345b7cc826af4dc673bc94f82eeac07a12967c8ebe0953d13c09c3be82cd" exitCode=0 Oct 01 10:30:13 crc kubenswrapper[4735]: I1001 10:30:13.106142 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-669s7" event={"ID":"5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b","Type":"ContainerDied","Data":"d941345b7cc826af4dc673bc94f82eeac07a12967c8ebe0953d13c09c3be82cd"} Oct 01 10:30:13 crc kubenswrapper[4735]: I1001 10:30:13.137442 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-794c859bbc-8kzp5" podStartSLOduration=1.453952395 podStartE2EDuration="9.137422507s" podCreationTimestamp="2025-10-01 10:30:04 +0000 UTC" firstStartedPulling="2025-10-01 10:30:05.143779284 +0000 UTC m=+763.836600566" lastFinishedPulling="2025-10-01 10:30:12.827249416 +0000 UTC m=+771.520070678" observedRunningTime="2025-10-01 10:30:13.135736643 +0000 UTC m=+771.828557905" watchObservedRunningTime="2025-10-01 10:30:13.137422507 +0000 UTC m=+771.830243779" Oct 01 10:30:13 crc kubenswrapper[4735]: I1001 
10:30:13.904222 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51fd9d7c-246d-4279-a6ea-82bc5e697058" path="/var/lib/kubelet/pods/51fd9d7c-246d-4279-a6ea-82bc5e697058/volumes" Oct 01 10:30:14 crc kubenswrapper[4735]: I1001 10:30:14.112551 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6qncp" podUID="b6a1b789-fd48-4452-be68-f5d108f5f208" containerName="registry-server" containerID="cri-o://bb2057353a7443ec796661bc496f8610059971e630cbd659c97be4aada24fd35" gracePeriod=2 Oct 01 10:30:14 crc kubenswrapper[4735]: I1001 10:30:14.470554 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6qncp" Oct 01 10:30:14 crc kubenswrapper[4735]: I1001 10:30:14.597011 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6a1b789-fd48-4452-be68-f5d108f5f208-catalog-content\") pod \"b6a1b789-fd48-4452-be68-f5d108f5f208\" (UID: \"b6a1b789-fd48-4452-be68-f5d108f5f208\") " Oct 01 10:30:14 crc kubenswrapper[4735]: I1001 10:30:14.597131 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sm5lc\" (UniqueName: \"kubernetes.io/projected/b6a1b789-fd48-4452-be68-f5d108f5f208-kube-api-access-sm5lc\") pod \"b6a1b789-fd48-4452-be68-f5d108f5f208\" (UID: \"b6a1b789-fd48-4452-be68-f5d108f5f208\") " Oct 01 10:30:14 crc kubenswrapper[4735]: I1001 10:30:14.597204 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6a1b789-fd48-4452-be68-f5d108f5f208-utilities\") pod \"b6a1b789-fd48-4452-be68-f5d108f5f208\" (UID: \"b6a1b789-fd48-4452-be68-f5d108f5f208\") " Oct 01 10:30:14 crc kubenswrapper[4735]: I1001 10:30:14.598250 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/b6a1b789-fd48-4452-be68-f5d108f5f208-utilities" (OuterVolumeSpecName: "utilities") pod "b6a1b789-fd48-4452-be68-f5d108f5f208" (UID: "b6a1b789-fd48-4452-be68-f5d108f5f208"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:30:14 crc kubenswrapper[4735]: I1001 10:30:14.606516 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6a1b789-fd48-4452-be68-f5d108f5f208-kube-api-access-sm5lc" (OuterVolumeSpecName: "kube-api-access-sm5lc") pod "b6a1b789-fd48-4452-be68-f5d108f5f208" (UID: "b6a1b789-fd48-4452-be68-f5d108f5f208"). InnerVolumeSpecName "kube-api-access-sm5lc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:30:14 crc kubenswrapper[4735]: I1001 10:30:14.699185 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6a1b789-fd48-4452-be68-f5d108f5f208-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 10:30:14 crc kubenswrapper[4735]: I1001 10:30:14.699218 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sm5lc\" (UniqueName: \"kubernetes.io/projected/b6a1b789-fd48-4452-be68-f5d108f5f208-kube-api-access-sm5lc\") on node \"crc\" DevicePath \"\"" Oct 01 10:30:15 crc kubenswrapper[4735]: I1001 10:30:15.121181 4735 generic.go:334] "Generic (PLEG): container finished" podID="b6a1b789-fd48-4452-be68-f5d108f5f208" containerID="bb2057353a7443ec796661bc496f8610059971e630cbd659c97be4aada24fd35" exitCode=0 Oct 01 10:30:15 crc kubenswrapper[4735]: I1001 10:30:15.121226 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6qncp" event={"ID":"b6a1b789-fd48-4452-be68-f5d108f5f208","Type":"ContainerDied","Data":"bb2057353a7443ec796661bc496f8610059971e630cbd659c97be4aada24fd35"} Oct 01 10:30:15 crc kubenswrapper[4735]: I1001 10:30:15.121273 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-6qncp" event={"ID":"b6a1b789-fd48-4452-be68-f5d108f5f208","Type":"ContainerDied","Data":"917bf69a6b91681315cc30b4e38103d4ae645e0aec6e8868fc87f035359ca384"} Oct 01 10:30:15 crc kubenswrapper[4735]: I1001 10:30:15.121280 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6qncp" Oct 01 10:30:15 crc kubenswrapper[4735]: I1001 10:30:15.121337 4735 scope.go:117] "RemoveContainer" containerID="bb2057353a7443ec796661bc496f8610059971e630cbd659c97be4aada24fd35" Oct 01 10:30:15 crc kubenswrapper[4735]: I1001 10:30:15.125359 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-669s7" event={"ID":"5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b","Type":"ContainerStarted","Data":"c7021bd22cefd8830db86739df6d5fd6ac0032eb982205f23ac1ee5abe822493"} Oct 01 10:30:15 crc kubenswrapper[4735]: I1001 10:30:15.141171 4735 scope.go:117] "RemoveContainer" containerID="34735302f05e365bb11b738fffd0b3cd432997f8d52acd4e9a339217bbe3cdb6" Oct 01 10:30:15 crc kubenswrapper[4735]: I1001 10:30:15.152065 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-669s7" podStartSLOduration=5.17643842 podStartE2EDuration="10.152035029s" podCreationTimestamp="2025-10-01 10:30:05 +0000 UTC" firstStartedPulling="2025-10-01 10:30:09.067709296 +0000 UTC m=+767.760530578" lastFinishedPulling="2025-10-01 10:30:14.043305935 +0000 UTC m=+772.736127187" observedRunningTime="2025-10-01 10:30:15.146114981 +0000 UTC m=+773.838936303" watchObservedRunningTime="2025-10-01 10:30:15.152035029 +0000 UTC m=+773.844856341" Oct 01 10:30:15 crc kubenswrapper[4735]: I1001 10:30:15.160869 4735 scope.go:117] "RemoveContainer" containerID="8cdd3eee68cedd2ef188fb2a40ff8838351bef1873eff608d4ff340531d44683" Oct 01 10:30:15 crc kubenswrapper[4735]: I1001 10:30:15.189675 4735 scope.go:117] "RemoveContainer" 
containerID="bb2057353a7443ec796661bc496f8610059971e630cbd659c97be4aada24fd35" Oct 01 10:30:15 crc kubenswrapper[4735]: E1001 10:30:15.191039 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb2057353a7443ec796661bc496f8610059971e630cbd659c97be4aada24fd35\": container with ID starting with bb2057353a7443ec796661bc496f8610059971e630cbd659c97be4aada24fd35 not found: ID does not exist" containerID="bb2057353a7443ec796661bc496f8610059971e630cbd659c97be4aada24fd35" Oct 01 10:30:15 crc kubenswrapper[4735]: I1001 10:30:15.191092 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb2057353a7443ec796661bc496f8610059971e630cbd659c97be4aada24fd35"} err="failed to get container status \"bb2057353a7443ec796661bc496f8610059971e630cbd659c97be4aada24fd35\": rpc error: code = NotFound desc = could not find container \"bb2057353a7443ec796661bc496f8610059971e630cbd659c97be4aada24fd35\": container with ID starting with bb2057353a7443ec796661bc496f8610059971e630cbd659c97be4aada24fd35 not found: ID does not exist" Oct 01 10:30:15 crc kubenswrapper[4735]: I1001 10:30:15.191124 4735 scope.go:117] "RemoveContainer" containerID="34735302f05e365bb11b738fffd0b3cd432997f8d52acd4e9a339217bbe3cdb6" Oct 01 10:30:15 crc kubenswrapper[4735]: E1001 10:30:15.191918 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34735302f05e365bb11b738fffd0b3cd432997f8d52acd4e9a339217bbe3cdb6\": container with ID starting with 34735302f05e365bb11b738fffd0b3cd432997f8d52acd4e9a339217bbe3cdb6 not found: ID does not exist" containerID="34735302f05e365bb11b738fffd0b3cd432997f8d52acd4e9a339217bbe3cdb6" Oct 01 10:30:15 crc kubenswrapper[4735]: I1001 10:30:15.191952 4735 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"34735302f05e365bb11b738fffd0b3cd432997f8d52acd4e9a339217bbe3cdb6"} err="failed to get container status \"34735302f05e365bb11b738fffd0b3cd432997f8d52acd4e9a339217bbe3cdb6\": rpc error: code = NotFound desc = could not find container \"34735302f05e365bb11b738fffd0b3cd432997f8d52acd4e9a339217bbe3cdb6\": container with ID starting with 34735302f05e365bb11b738fffd0b3cd432997f8d52acd4e9a339217bbe3cdb6 not found: ID does not exist" Oct 01 10:30:15 crc kubenswrapper[4735]: I1001 10:30:15.191973 4735 scope.go:117] "RemoveContainer" containerID="8cdd3eee68cedd2ef188fb2a40ff8838351bef1873eff608d4ff340531d44683" Oct 01 10:30:15 crc kubenswrapper[4735]: E1001 10:30:15.192661 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cdd3eee68cedd2ef188fb2a40ff8838351bef1873eff608d4ff340531d44683\": container with ID starting with 8cdd3eee68cedd2ef188fb2a40ff8838351bef1873eff608d4ff340531d44683 not found: ID does not exist" containerID="8cdd3eee68cedd2ef188fb2a40ff8838351bef1873eff608d4ff340531d44683" Oct 01 10:30:15 crc kubenswrapper[4735]: I1001 10:30:15.192703 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cdd3eee68cedd2ef188fb2a40ff8838351bef1873eff608d4ff340531d44683"} err="failed to get container status \"8cdd3eee68cedd2ef188fb2a40ff8838351bef1873eff608d4ff340531d44683\": rpc error: code = NotFound desc = could not find container \"8cdd3eee68cedd2ef188fb2a40ff8838351bef1873eff608d4ff340531d44683\": container with ID starting with 8cdd3eee68cedd2ef188fb2a40ff8838351bef1873eff608d4ff340531d44683 not found: ID does not exist" Oct 01 10:30:15 crc kubenswrapper[4735]: I1001 10:30:15.914464 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-669s7" Oct 01 10:30:15 crc kubenswrapper[4735]: I1001 10:30:15.915028 4735 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-669s7" Oct 01 10:30:15 crc kubenswrapper[4735]: I1001 10:30:15.961821 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-669s7" Oct 01 10:30:16 crc kubenswrapper[4735]: I1001 10:30:16.287173 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6a1b789-fd48-4452-be68-f5d108f5f208-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b6a1b789-fd48-4452-be68-f5d108f5f208" (UID: "b6a1b789-fd48-4452-be68-f5d108f5f208"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:30:16 crc kubenswrapper[4735]: I1001 10:30:16.321875 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6a1b789-fd48-4452-be68-f5d108f5f208-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 10:30:16 crc kubenswrapper[4735]: I1001 10:30:16.349611 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6qncp"] Oct 01 10:30:16 crc kubenswrapper[4735]: I1001 10:30:16.353229 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6qncp"] Oct 01 10:30:17 crc kubenswrapper[4735]: I1001 10:30:17.906421 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6a1b789-fd48-4452-be68-f5d108f5f208" path="/var/lib/kubelet/pods/b6a1b789-fd48-4452-be68-f5d108f5f208/volumes" Oct 01 10:30:25 crc kubenswrapper[4735]: I1001 10:30:25.967660 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-669s7" Oct 01 10:30:28 crc kubenswrapper[4735]: I1001 10:30:28.770949 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-669s7"] Oct 01 10:30:28 crc kubenswrapper[4735]: I1001 
10:30:28.771422 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-669s7" podUID="5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b" containerName="registry-server" containerID="cri-o://c7021bd22cefd8830db86739df6d5fd6ac0032eb982205f23ac1ee5abe822493" gracePeriod=2 Oct 01 10:30:29 crc kubenswrapper[4735]: I1001 10:30:29.168255 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-669s7" Oct 01 10:30:29 crc kubenswrapper[4735]: I1001 10:30:29.219018 4735 generic.go:334] "Generic (PLEG): container finished" podID="5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b" containerID="c7021bd22cefd8830db86739df6d5fd6ac0032eb982205f23ac1ee5abe822493" exitCode=0 Oct 01 10:30:29 crc kubenswrapper[4735]: I1001 10:30:29.219059 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-669s7" event={"ID":"5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b","Type":"ContainerDied","Data":"c7021bd22cefd8830db86739df6d5fd6ac0032eb982205f23ac1ee5abe822493"} Oct 01 10:30:29 crc kubenswrapper[4735]: I1001 10:30:29.219084 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-669s7" event={"ID":"5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b","Type":"ContainerDied","Data":"a0325d26fad3f9e7c6dc407c61fd976a16a5d8fd713f90dce977e1762f1d373c"} Oct 01 10:30:29 crc kubenswrapper[4735]: I1001 10:30:29.219102 4735 scope.go:117] "RemoveContainer" containerID="c7021bd22cefd8830db86739df6d5fd6ac0032eb982205f23ac1ee5abe822493" Oct 01 10:30:29 crc kubenswrapper[4735]: I1001 10:30:29.219212 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-669s7" Oct 01 10:30:29 crc kubenswrapper[4735]: I1001 10:30:29.243713 4735 scope.go:117] "RemoveContainer" containerID="d941345b7cc826af4dc673bc94f82eeac07a12967c8ebe0953d13c09c3be82cd" Oct 01 10:30:29 crc kubenswrapper[4735]: I1001 10:30:29.262007 4735 scope.go:117] "RemoveContainer" containerID="722da8612fbf4d5f8c486295d8c609e7d42094f2daa9b6127f57ee6f6a38b58b" Oct 01 10:30:29 crc kubenswrapper[4735]: I1001 10:30:29.292298 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bt4j\" (UniqueName: \"kubernetes.io/projected/5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b-kube-api-access-9bt4j\") pod \"5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b\" (UID: \"5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b\") " Oct 01 10:30:29 crc kubenswrapper[4735]: I1001 10:30:29.292397 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b-utilities\") pod \"5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b\" (UID: \"5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b\") " Oct 01 10:30:29 crc kubenswrapper[4735]: I1001 10:30:29.292416 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b-catalog-content\") pod \"5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b\" (UID: \"5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b\") " Oct 01 10:30:29 crc kubenswrapper[4735]: I1001 10:30:29.294484 4735 scope.go:117] "RemoveContainer" containerID="c7021bd22cefd8830db86739df6d5fd6ac0032eb982205f23ac1ee5abe822493" Oct 01 10:30:29 crc kubenswrapper[4735]: I1001 10:30:29.295334 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b-utilities" (OuterVolumeSpecName: "utilities") pod "5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b" (UID: 
"5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:30:29 crc kubenswrapper[4735]: E1001 10:30:29.296065 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7021bd22cefd8830db86739df6d5fd6ac0032eb982205f23ac1ee5abe822493\": container with ID starting with c7021bd22cefd8830db86739df6d5fd6ac0032eb982205f23ac1ee5abe822493 not found: ID does not exist" containerID="c7021bd22cefd8830db86739df6d5fd6ac0032eb982205f23ac1ee5abe822493" Oct 01 10:30:29 crc kubenswrapper[4735]: I1001 10:30:29.296106 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7021bd22cefd8830db86739df6d5fd6ac0032eb982205f23ac1ee5abe822493"} err="failed to get container status \"c7021bd22cefd8830db86739df6d5fd6ac0032eb982205f23ac1ee5abe822493\": rpc error: code = NotFound desc = could not find container \"c7021bd22cefd8830db86739df6d5fd6ac0032eb982205f23ac1ee5abe822493\": container with ID starting with c7021bd22cefd8830db86739df6d5fd6ac0032eb982205f23ac1ee5abe822493 not found: ID does not exist" Oct 01 10:30:29 crc kubenswrapper[4735]: I1001 10:30:29.296130 4735 scope.go:117] "RemoveContainer" containerID="d941345b7cc826af4dc673bc94f82eeac07a12967c8ebe0953d13c09c3be82cd" Oct 01 10:30:29 crc kubenswrapper[4735]: E1001 10:30:29.296469 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d941345b7cc826af4dc673bc94f82eeac07a12967c8ebe0953d13c09c3be82cd\": container with ID starting with d941345b7cc826af4dc673bc94f82eeac07a12967c8ebe0953d13c09c3be82cd not found: ID does not exist" containerID="d941345b7cc826af4dc673bc94f82eeac07a12967c8ebe0953d13c09c3be82cd" Oct 01 10:30:29 crc kubenswrapper[4735]: I1001 10:30:29.296511 4735 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d941345b7cc826af4dc673bc94f82eeac07a12967c8ebe0953d13c09c3be82cd"} err="failed to get container status \"d941345b7cc826af4dc673bc94f82eeac07a12967c8ebe0953d13c09c3be82cd\": rpc error: code = NotFound desc = could not find container \"d941345b7cc826af4dc673bc94f82eeac07a12967c8ebe0953d13c09c3be82cd\": container with ID starting with d941345b7cc826af4dc673bc94f82eeac07a12967c8ebe0953d13c09c3be82cd not found: ID does not exist" Oct 01 10:30:29 crc kubenswrapper[4735]: I1001 10:30:29.296525 4735 scope.go:117] "RemoveContainer" containerID="722da8612fbf4d5f8c486295d8c609e7d42094f2daa9b6127f57ee6f6a38b58b" Oct 01 10:30:29 crc kubenswrapper[4735]: E1001 10:30:29.297778 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"722da8612fbf4d5f8c486295d8c609e7d42094f2daa9b6127f57ee6f6a38b58b\": container with ID starting with 722da8612fbf4d5f8c486295d8c609e7d42094f2daa9b6127f57ee6f6a38b58b not found: ID does not exist" containerID="722da8612fbf4d5f8c486295d8c609e7d42094f2daa9b6127f57ee6f6a38b58b" Oct 01 10:30:29 crc kubenswrapper[4735]: I1001 10:30:29.297817 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"722da8612fbf4d5f8c486295d8c609e7d42094f2daa9b6127f57ee6f6a38b58b"} err="failed to get container status \"722da8612fbf4d5f8c486295d8c609e7d42094f2daa9b6127f57ee6f6a38b58b\": rpc error: code = NotFound desc = could not find container \"722da8612fbf4d5f8c486295d8c609e7d42094f2daa9b6127f57ee6f6a38b58b\": container with ID starting with 722da8612fbf4d5f8c486295d8c609e7d42094f2daa9b6127f57ee6f6a38b58b not found: ID does not exist" Oct 01 10:30:29 crc kubenswrapper[4735]: I1001 10:30:29.300837 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b-kube-api-access-9bt4j" (OuterVolumeSpecName: "kube-api-access-9bt4j") pod 
"5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b" (UID: "5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b"). InnerVolumeSpecName "kube-api-access-9bt4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:30:29 crc kubenswrapper[4735]: I1001 10:30:29.305655 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b" (UID: "5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:30:29 crc kubenswrapper[4735]: I1001 10:30:29.393464 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bt4j\" (UniqueName: \"kubernetes.io/projected/5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b-kube-api-access-9bt4j\") on node \"crc\" DevicePath \"\"" Oct 01 10:30:29 crc kubenswrapper[4735]: I1001 10:30:29.393521 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 10:30:29 crc kubenswrapper[4735]: I1001 10:30:29.393534 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 10:30:29 crc kubenswrapper[4735]: I1001 10:30:29.544436 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-669s7"] Oct 01 10:30:29 crc kubenswrapper[4735]: I1001 10:30:29.551897 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-669s7"] Oct 01 10:30:29 crc kubenswrapper[4735]: I1001 10:30:29.908292 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b" 
path="/var/lib/kubelet/pods/5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b/volumes" Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.598339 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-v88gl"] Oct 01 10:30:30 crc kubenswrapper[4735]: E1001 10:30:30.598844 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6a1b789-fd48-4452-be68-f5d108f5f208" containerName="extract-content" Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.598857 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6a1b789-fd48-4452-be68-f5d108f5f208" containerName="extract-content" Oct 01 10:30:30 crc kubenswrapper[4735]: E1001 10:30:30.598865 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b" containerName="extract-utilities" Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.598872 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b" containerName="extract-utilities" Oct 01 10:30:30 crc kubenswrapper[4735]: E1001 10:30:30.598884 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51fd9d7c-246d-4279-a6ea-82bc5e697058" containerName="extract-content" Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.598890 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="51fd9d7c-246d-4279-a6ea-82bc5e697058" containerName="extract-content" Oct 01 10:30:30 crc kubenswrapper[4735]: E1001 10:30:30.598900 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b" containerName="registry-server" Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.598907 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b" containerName="registry-server" Oct 01 10:30:30 crc kubenswrapper[4735]: E1001 10:30:30.598917 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b6a1b789-fd48-4452-be68-f5d108f5f208" containerName="registry-server" Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.598923 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6a1b789-fd48-4452-be68-f5d108f5f208" containerName="registry-server" Oct 01 10:30:30 crc kubenswrapper[4735]: E1001 10:30:30.598930 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51fd9d7c-246d-4279-a6ea-82bc5e697058" containerName="registry-server" Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.598935 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="51fd9d7c-246d-4279-a6ea-82bc5e697058" containerName="registry-server" Oct 01 10:30:30 crc kubenswrapper[4735]: E1001 10:30:30.598946 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b" containerName="extract-content" Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.598951 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b" containerName="extract-content" Oct 01 10:30:30 crc kubenswrapper[4735]: E1001 10:30:30.598962 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51fd9d7c-246d-4279-a6ea-82bc5e697058" containerName="extract-utilities" Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.598967 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="51fd9d7c-246d-4279-a6ea-82bc5e697058" containerName="extract-utilities" Oct 01 10:30:30 crc kubenswrapper[4735]: E1001 10:30:30.598973 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6a1b789-fd48-4452-be68-f5d108f5f208" containerName="extract-utilities" Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.598978 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6a1b789-fd48-4452-be68-f5d108f5f208" containerName="extract-utilities" Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.599069 4735 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5a0efed0-2b56-4ee2-ba0e-8a34a4a1e00b" containerName="registry-server"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.599084 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="51fd9d7c-246d-4279-a6ea-82bc5e697058" containerName="registry-server"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.599094 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6a1b789-fd48-4452-be68-f5d108f5f208" containerName="registry-server"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.599639 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-v88gl"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.602004 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-c54wj"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.617601 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-l5lmn"]
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.618505 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-l5lmn"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.620084 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-g8rk2"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.622871 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-v88gl"]
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.631211 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-l5lmn"]
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.654311 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-fj5j6"]
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.655205 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-fj5j6"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.660437 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-vrbzw"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.684628 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-fj5j6"]
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.700227 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-hgmvf"]
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.701375 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-hgmvf"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.704811 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-spqvd"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.710959 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5jcj\" (UniqueName: \"kubernetes.io/projected/db6d8b12-0a21-40a5-b23a-e943494a2091-kube-api-access-g5jcj\") pod \"cinder-operator-controller-manager-644bddb6d8-l5lmn\" (UID: \"db6d8b12-0a21-40a5-b23a-e943494a2091\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-l5lmn"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.711010 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd9qm\" (UniqueName: \"kubernetes.io/projected/27b0c660-ca46-48cb-88ca-bb5715532c80-kube-api-access-jd9qm\") pod \"barbican-operator-controller-manager-6ff8b75857-v88gl\" (UID: \"27b0c660-ca46-48cb-88ca-bb5715532c80\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-v88gl"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.737832 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-qp9c7"]
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.738843 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-qp9c7"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.742449 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-f96rt"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.746758 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-hgmvf"]
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.755843 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-qp9c7"]
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.763125 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-zsvgx"]
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.764023 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-zsvgx"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.775861 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-fkwcz"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.782583 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-zsvgx"]
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.793732 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-txp8d"]
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.794738 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-txp8d"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.799564 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-wpgzv"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.799769 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.812690 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c425z\" (UniqueName: \"kubernetes.io/projected/263d49ed-e577-457b-b887-33f95f1bbed0-kube-api-access-c425z\") pod \"designate-operator-controller-manager-84f4f7b77b-fj5j6\" (UID: \"263d49ed-e577-457b-b887-33f95f1bbed0\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-fj5j6"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.812762 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5jcj\" (UniqueName: \"kubernetes.io/projected/db6d8b12-0a21-40a5-b23a-e943494a2091-kube-api-access-g5jcj\") pod \"cinder-operator-controller-manager-644bddb6d8-l5lmn\" (UID: \"db6d8b12-0a21-40a5-b23a-e943494a2091\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-l5lmn"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.812802 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd9qm\" (UniqueName: \"kubernetes.io/projected/27b0c660-ca46-48cb-88ca-bb5715532c80-kube-api-access-jd9qm\") pod \"barbican-operator-controller-manager-6ff8b75857-v88gl\" (UID: \"27b0c660-ca46-48cb-88ca-bb5715532c80\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-v88gl"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.812823 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dwgj\" (UniqueName: \"kubernetes.io/projected/bb3f6d5e-c5f1-40a5-b85e-dc33dd683d98-kube-api-access-4dwgj\") pod \"glance-operator-controller-manager-84958c4d49-hgmvf\" (UID: \"bb3f6d5e-c5f1-40a5-b85e-dc33dd683d98\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-hgmvf"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.823882 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-hw4lp"]
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.824818 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-hw4lp"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.832368 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-dfrs8"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.832914 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-txp8d"]
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.854358 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5jcj\" (UniqueName: \"kubernetes.io/projected/db6d8b12-0a21-40a5-b23a-e943494a2091-kube-api-access-g5jcj\") pod \"cinder-operator-controller-manager-644bddb6d8-l5lmn\" (UID: \"db6d8b12-0a21-40a5-b23a-e943494a2091\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-l5lmn"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.854402 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd9qm\" (UniqueName: \"kubernetes.io/projected/27b0c660-ca46-48cb-88ca-bb5715532c80-kube-api-access-jd9qm\") pod \"barbican-operator-controller-manager-6ff8b75857-v88gl\" (UID: \"27b0c660-ca46-48cb-88ca-bb5715532c80\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-v88gl"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.860810 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-hw4lp"]
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.874643 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-7hcv5"]
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.875600 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-7hcv5"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.881583 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-wftzg"]
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.882617 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-wftzg"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.883830 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-6rjtt"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.888853 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-z6dcb"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.902554 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-7hcv5"]
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.915325 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-wftzg"]
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.916205 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a48cc547-6994-48eb-b8db-9682c091fdac-cert\") pod \"infra-operator-controller-manager-9d6c5db85-txp8d\" (UID: \"a48cc547-6994-48eb-b8db-9682c091fdac\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-txp8d"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.916250 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dwgj\" (UniqueName: \"kubernetes.io/projected/bb3f6d5e-c5f1-40a5-b85e-dc33dd683d98-kube-api-access-4dwgj\") pod \"glance-operator-controller-manager-84958c4d49-hgmvf\" (UID: \"bb3f6d5e-c5f1-40a5-b85e-dc33dd683d98\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-hgmvf"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.916272 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkl8d\" (UniqueName: \"kubernetes.io/projected/c229f8d8-2a3c-4a42-ae75-f83e7bbc99e9-kube-api-access-wkl8d\") pod \"heat-operator-controller-manager-5d889d78cf-qp9c7\" (UID: \"c229f8d8-2a3c-4a42-ae75-f83e7bbc99e9\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-qp9c7"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.916302 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c425z\" (UniqueName: \"kubernetes.io/projected/263d49ed-e577-457b-b887-33f95f1bbed0-kube-api-access-c425z\") pod \"designate-operator-controller-manager-84f4f7b77b-fj5j6\" (UID: \"263d49ed-e577-457b-b887-33f95f1bbed0\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-fj5j6"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.916351 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-828x5\" (UniqueName: \"kubernetes.io/projected/a48cc547-6994-48eb-b8db-9682c091fdac-kube-api-access-828x5\") pod \"infra-operator-controller-manager-9d6c5db85-txp8d\" (UID: \"a48cc547-6994-48eb-b8db-9682c091fdac\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-txp8d"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.916379 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lhf7\" (UniqueName: \"kubernetes.io/projected/8a30378d-5948-4a39-b5ec-85b29f5763e9-kube-api-access-5lhf7\") pod \"ironic-operator-controller-manager-5cd4858477-hw4lp\" (UID: \"8a30378d-5948-4a39-b5ec-85b29f5763e9\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-hw4lp"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.916409 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62znn\" (UniqueName: \"kubernetes.io/projected/7df184a1-eb46-4e19-85cd-82e8d5da1880-kube-api-access-62znn\") pod \"horizon-operator-controller-manager-9f4696d94-zsvgx\" (UID: \"7df184a1-eb46-4e19-85cd-82e8d5da1880\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-zsvgx"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.918594 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-m5vc9"]
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.922066 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-v88gl"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.926412 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-m5vc9"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.926923 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-9k2wg"]
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.928052 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-9k2wg"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.928615 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-2bj47"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.939847 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-xjfmc"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.940157 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-m5vc9"]
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.947754 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-l5lmn"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.950661 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-mr6vv"]
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.951915 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-mr6vv"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.959065 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c425z\" (UniqueName: \"kubernetes.io/projected/263d49ed-e577-457b-b887-33f95f1bbed0-kube-api-access-c425z\") pod \"designate-operator-controller-manager-84f4f7b77b-fj5j6\" (UID: \"263d49ed-e577-457b-b887-33f95f1bbed0\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-fj5j6"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.960341 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-4fv68"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.964554 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-9k2wg"]
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.975214 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dwgj\" (UniqueName: \"kubernetes.io/projected/bb3f6d5e-c5f1-40a5-b85e-dc33dd683d98-kube-api-access-4dwgj\") pod \"glance-operator-controller-manager-84958c4d49-hgmvf\" (UID: \"bb3f6d5e-c5f1-40a5-b85e-dc33dd683d98\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-hgmvf"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.977426 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-lg9mq"]
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.978425 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-lg9mq"
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.994218 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-mr6vv"]
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.994269 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-lg9mq"]
Oct 01 10:30:30 crc kubenswrapper[4735]: I1001 10:30:30.995964 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-cpgg4"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.018518 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkl8d\" (UniqueName: \"kubernetes.io/projected/c229f8d8-2a3c-4a42-ae75-f83e7bbc99e9-kube-api-access-wkl8d\") pod \"heat-operator-controller-manager-5d889d78cf-qp9c7\" (UID: \"c229f8d8-2a3c-4a42-ae75-f83e7bbc99e9\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-qp9c7"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.018614 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-828x5\" (UniqueName: \"kubernetes.io/projected/a48cc547-6994-48eb-b8db-9682c091fdac-kube-api-access-828x5\") pod \"infra-operator-controller-manager-9d6c5db85-txp8d\" (UID: \"a48cc547-6994-48eb-b8db-9682c091fdac\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-txp8d"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.018649 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd8r7\" (UniqueName: \"kubernetes.io/projected/912f537c-db9b-4256-a5e0-81dc33bcaf3e-kube-api-access-hd8r7\") pod \"mariadb-operator-controller-manager-88c7-m5vc9\" (UID: \"912f537c-db9b-4256-a5e0-81dc33bcaf3e\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-m5vc9"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.018680 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lhf7\" (UniqueName: \"kubernetes.io/projected/8a30378d-5948-4a39-b5ec-85b29f5763e9-kube-api-access-5lhf7\") pod \"ironic-operator-controller-manager-5cd4858477-hw4lp\" (UID: \"8a30378d-5948-4a39-b5ec-85b29f5763e9\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-hw4lp"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.018708 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62znn\" (UniqueName: \"kubernetes.io/projected/7df184a1-eb46-4e19-85cd-82e8d5da1880-kube-api-access-62znn\") pod \"horizon-operator-controller-manager-9f4696d94-zsvgx\" (UID: \"7df184a1-eb46-4e19-85cd-82e8d5da1880\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-zsvgx"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.018724 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9rn4\" (UniqueName: \"kubernetes.io/projected/663344ae-dd00-416c-9120-d4f0721554b4-kube-api-access-v9rn4\") pod \"neutron-operator-controller-manager-849d5b9b84-9k2wg\" (UID: \"663344ae-dd00-416c-9120-d4f0721554b4\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-9k2wg"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.018745 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfwvv\" (UniqueName: \"kubernetes.io/projected/628c079b-418b-4299-9394-a59ab2850d23-kube-api-access-jfwvv\") pod \"keystone-operator-controller-manager-5bd55b4bff-wftzg\" (UID: \"628c079b-418b-4299-9394-a59ab2850d23\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-wftzg"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.018767 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbxkl\" (UniqueName: \"kubernetes.io/projected/a1e6fcf2-bfad-48fe-b655-0e9199818230-kube-api-access-sbxkl\") pod \"manila-operator-controller-manager-6d68dbc695-7hcv5\" (UID: \"a1e6fcf2-bfad-48fe-b655-0e9199818230\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-7hcv5"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.018808 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a48cc547-6994-48eb-b8db-9682c091fdac-cert\") pod \"infra-operator-controller-manager-9d6c5db85-txp8d\" (UID: \"a48cc547-6994-48eb-b8db-9682c091fdac\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-txp8d"
Oct 01 10:30:31 crc kubenswrapper[4735]: E1001 10:30:31.018914 4735 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Oct 01 10:30:31 crc kubenswrapper[4735]: E1001 10:30:31.018968 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a48cc547-6994-48eb-b8db-9682c091fdac-cert podName:a48cc547-6994-48eb-b8db-9682c091fdac nodeName:}" failed. No retries permitted until 2025-10-01 10:30:31.518952245 +0000 UTC m=+790.211773507 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a48cc547-6994-48eb-b8db-9682c091fdac-cert") pod "infra-operator-controller-manager-9d6c5db85-txp8d" (UID: "a48cc547-6994-48eb-b8db-9682c091fdac") : secret "infra-operator-webhook-server-cert" not found
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.019065 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-fj5j6"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.044627 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cllzwc"]
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.044926 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-hgmvf"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.054528 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cllzwc"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.066861 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.095858 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-pbtr4"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.114782 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lhf7\" (UniqueName: \"kubernetes.io/projected/8a30378d-5948-4a39-b5ec-85b29f5763e9-kube-api-access-5lhf7\") pod \"ironic-operator-controller-manager-5cd4858477-hw4lp\" (UID: \"8a30378d-5948-4a39-b5ec-85b29f5763e9\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-hw4lp"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.120314 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd8r7\" (UniqueName: \"kubernetes.io/projected/912f537c-db9b-4256-a5e0-81dc33bcaf3e-kube-api-access-hd8r7\") pod \"mariadb-operator-controller-manager-88c7-m5vc9\" (UID: \"912f537c-db9b-4256-a5e0-81dc33bcaf3e\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-m5vc9"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.120347 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdhdt\" (UniqueName: \"kubernetes.io/projected/6b1d4e62-3eb6-4090-826d-e627e08c73c6-kube-api-access-zdhdt\") pod \"octavia-operator-controller-manager-7b787867f4-lg9mq\" (UID: \"6b1d4e62-3eb6-4090-826d-e627e08c73c6\") " pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-lg9mq"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.121515 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9rn4\" (UniqueName: \"kubernetes.io/projected/663344ae-dd00-416c-9120-d4f0721554b4-kube-api-access-v9rn4\") pod \"neutron-operator-controller-manager-849d5b9b84-9k2wg\" (UID: \"663344ae-dd00-416c-9120-d4f0721554b4\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-9k2wg"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.121542 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd4rz\" (UniqueName: \"kubernetes.io/projected/3fd0c004-178e-41cb-be27-2d2342d9f58c-kube-api-access-fd4rz\") pod \"nova-operator-controller-manager-64cd67b5cb-mr6vv\" (UID: \"3fd0c004-178e-41cb-be27-2d2342d9f58c\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-mr6vv"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.121562 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfwvv\" (UniqueName: \"kubernetes.io/projected/628c079b-418b-4299-9394-a59ab2850d23-kube-api-access-jfwvv\") pod \"keystone-operator-controller-manager-5bd55b4bff-wftzg\" (UID: \"628c079b-418b-4299-9394-a59ab2850d23\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-wftzg"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.121585 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbxkl\" (UniqueName: \"kubernetes.io/projected/a1e6fcf2-bfad-48fe-b655-0e9199818230-kube-api-access-sbxkl\") pod \"manila-operator-controller-manager-6d68dbc695-7hcv5\" (UID: \"a1e6fcf2-bfad-48fe-b655-0e9199818230\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-7hcv5"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.122692 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-2pfnh"]
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.123834 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-2pfnh"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.126550 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-82q9w"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.133410 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-2pfnh"]
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.134590 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-828x5\" (UniqueName: \"kubernetes.io/projected/a48cc547-6994-48eb-b8db-9682c091fdac-kube-api-access-828x5\") pod \"infra-operator-controller-manager-9d6c5db85-txp8d\" (UID: \"a48cc547-6994-48eb-b8db-9682c091fdac\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-txp8d"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.135230 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkl8d\" (UniqueName: \"kubernetes.io/projected/c229f8d8-2a3c-4a42-ae75-f83e7bbc99e9-kube-api-access-wkl8d\") pod \"heat-operator-controller-manager-5d889d78cf-qp9c7\" (UID: \"c229f8d8-2a3c-4a42-ae75-f83e7bbc99e9\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-qp9c7"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.138314 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62znn\" (UniqueName: \"kubernetes.io/projected/7df184a1-eb46-4e19-85cd-82e8d5da1880-kube-api-access-62znn\") pod \"horizon-operator-controller-manager-9f4696d94-zsvgx\" (UID: \"7df184a1-eb46-4e19-85cd-82e8d5da1880\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-zsvgx"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.147970 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cllzwc"]
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.148018 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-p8c9g"]
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.148948 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-svvjg"]
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.149768 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-svvjg"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.150445 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-p8c9g"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.156718 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-j6dlz"]
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.157972 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-j6dlz"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.167252 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbxkl\" (UniqueName: \"kubernetes.io/projected/a1e6fcf2-bfad-48fe-b655-0e9199818230-kube-api-access-sbxkl\") pod \"manila-operator-controller-manager-6d68dbc695-7hcv5\" (UID: \"a1e6fcf2-bfad-48fe-b655-0e9199818230\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-7hcv5"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.167591 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-t5bsj"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.167911 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-24m9k"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.168092 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-gvgqr"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.168210 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-svvjg"]
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.168285 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-j6dlz"]
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.173624 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-p8c9g"]
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.177289 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd8r7\" (UniqueName: \"kubernetes.io/projected/912f537c-db9b-4256-a5e0-81dc33bcaf3e-kube-api-access-hd8r7\") pod \"mariadb-operator-controller-manager-88c7-m5vc9\" (UID: \"912f537c-db9b-4256-a5e0-81dc33bcaf3e\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-m5vc9"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.182094 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9rn4\" (UniqueName: \"kubernetes.io/projected/663344ae-dd00-416c-9120-d4f0721554b4-kube-api-access-v9rn4\") pod \"neutron-operator-controller-manager-849d5b9b84-9k2wg\" (UID: \"663344ae-dd00-416c-9120-d4f0721554b4\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-9k2wg"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.184796 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfwvv\" (UniqueName: \"kubernetes.io/projected/628c079b-418b-4299-9394-a59ab2850d23-kube-api-access-jfwvv\") pod \"keystone-operator-controller-manager-5bd55b4bff-wftzg\" (UID: \"628c079b-418b-4299-9394-a59ab2850d23\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-wftzg"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.188633 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-hw4lp"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.191911 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-5w5cv"]
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.193142 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-85777745bb-5w5cv"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.198591 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-rcnkt"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.211697 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-5w5cv"]
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.216762 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9957f54f-f29js"]
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.218182 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-7hcv5"
Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.222802 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-f29js" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.222874 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q9l9\" (UniqueName: \"kubernetes.io/projected/9fba9aa7-25bd-4d48-89e9-818af62e38af-kube-api-access-6q9l9\") pod \"placement-operator-controller-manager-589c58c6c-p8c9g\" (UID: \"9fba9aa7-25bd-4d48-89e9-818af62e38af\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-p8c9g" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.222908 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jvq8\" (UniqueName: \"kubernetes.io/projected/ef40c559-9b0d-478a-baa4-239ab6d71d76-kube-api-access-7jvq8\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8cllzwc\" (UID: \"ef40c559-9b0d-478a-baa4-239ab6d71d76\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cllzwc" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.222954 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qqwn\" (UniqueName: \"kubernetes.io/projected/82c5ecde-4ff5-42bc-9956-45d025d53f45-kube-api-access-7qqwn\") pod \"telemetry-operator-controller-manager-b8d54b5d7-svvjg\" (UID: \"82c5ecde-4ff5-42bc-9956-45d025d53f45\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-svvjg" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.222982 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brmdb\" (UniqueName: \"kubernetes.io/projected/50560653-c3f8-4fa8-9d19-a1525b1daaa2-kube-api-access-brmdb\") pod \"swift-operator-controller-manager-84d6b4b759-j6dlz\" (UID: \"50560653-c3f8-4fa8-9d19-a1525b1daaa2\") " 
pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-j6dlz" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.223001 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdhdt\" (UniqueName: \"kubernetes.io/projected/6b1d4e62-3eb6-4090-826d-e627e08c73c6-kube-api-access-zdhdt\") pod \"octavia-operator-controller-manager-7b787867f4-lg9mq\" (UID: \"6b1d4e62-3eb6-4090-826d-e627e08c73c6\") " pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-lg9mq" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.223029 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6tlw\" (UniqueName: \"kubernetes.io/projected/ef5f656f-602f-475c-8bd2-078c0bb43388-kube-api-access-r6tlw\") pod \"ovn-operator-controller-manager-9976ff44c-2pfnh\" (UID: \"ef5f656f-602f-475c-8bd2-078c0bb43388\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-2pfnh" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.223048 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd4rz\" (UniqueName: \"kubernetes.io/projected/3fd0c004-178e-41cb-be27-2d2342d9f58c-kube-api-access-fd4rz\") pod \"nova-operator-controller-manager-64cd67b5cb-mr6vv\" (UID: \"3fd0c004-178e-41cb-be27-2d2342d9f58c\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-mr6vv" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.224623 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef40c559-9b0d-478a-baa4-239ab6d71d76-cert\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8cllzwc\" (UID: \"ef40c559-9b0d-478a-baa4-239ab6d71d76\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cllzwc" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.227777 4735 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9957f54f-f29js"] Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.226263 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-jhfq8" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.243022 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd4rz\" (UniqueName: \"kubernetes.io/projected/3fd0c004-178e-41cb-be27-2d2342d9f58c-kube-api-access-fd4rz\") pod \"nova-operator-controller-manager-64cd67b5cb-mr6vv\" (UID: \"3fd0c004-178e-41cb-be27-2d2342d9f58c\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-mr6vv" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.243768 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-wftzg" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.249346 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdhdt\" (UniqueName: \"kubernetes.io/projected/6b1d4e62-3eb6-4090-826d-e627e08c73c6-kube-api-access-zdhdt\") pod \"octavia-operator-controller-manager-7b787867f4-lg9mq\" (UID: \"6b1d4e62-3eb6-4090-826d-e627e08c73c6\") " pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-lg9mq" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.328754 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6tlw\" (UniqueName: \"kubernetes.io/projected/ef5f656f-602f-475c-8bd2-078c0bb43388-kube-api-access-r6tlw\") pod \"ovn-operator-controller-manager-9976ff44c-2pfnh\" (UID: \"ef5f656f-602f-475c-8bd2-078c0bb43388\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-2pfnh" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.328798 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef40c559-9b0d-478a-baa4-239ab6d71d76-cert\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8cllzwc\" (UID: \"ef40c559-9b0d-478a-baa4-239ab6d71d76\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cllzwc" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.328847 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kptf8\" (UniqueName: \"kubernetes.io/projected/dfaa86bf-1e61-4393-ba0f-b9003fdbde80-kube-api-access-kptf8\") pod \"test-operator-controller-manager-85777745bb-5w5cv\" (UID: \"dfaa86bf-1e61-4393-ba0f-b9003fdbde80\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-5w5cv" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.328868 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nstm\" (UniqueName: \"kubernetes.io/projected/c4ac4b8f-7378-438b-8412-1b74d1c3fda9-kube-api-access-8nstm\") pod \"watcher-operator-controller-manager-6b9957f54f-f29js\" (UID: \"c4ac4b8f-7378-438b-8412-1b74d1c3fda9\") " pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-f29js" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.328891 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q9l9\" (UniqueName: \"kubernetes.io/projected/9fba9aa7-25bd-4d48-89e9-818af62e38af-kube-api-access-6q9l9\") pod \"placement-operator-controller-manager-589c58c6c-p8c9g\" (UID: \"9fba9aa7-25bd-4d48-89e9-818af62e38af\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-p8c9g" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.328911 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jvq8\" (UniqueName: 
\"kubernetes.io/projected/ef40c559-9b0d-478a-baa4-239ab6d71d76-kube-api-access-7jvq8\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8cllzwc\" (UID: \"ef40c559-9b0d-478a-baa4-239ab6d71d76\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cllzwc" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.328955 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qqwn\" (UniqueName: \"kubernetes.io/projected/82c5ecde-4ff5-42bc-9956-45d025d53f45-kube-api-access-7qqwn\") pod \"telemetry-operator-controller-manager-b8d54b5d7-svvjg\" (UID: \"82c5ecde-4ff5-42bc-9956-45d025d53f45\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-svvjg" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.328984 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brmdb\" (UniqueName: \"kubernetes.io/projected/50560653-c3f8-4fa8-9d19-a1525b1daaa2-kube-api-access-brmdb\") pod \"swift-operator-controller-manager-84d6b4b759-j6dlz\" (UID: \"50560653-c3f8-4fa8-9d19-a1525b1daaa2\") " pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-j6dlz" Oct 01 10:30:31 crc kubenswrapper[4735]: E1001 10:30:31.329825 4735 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 01 10:30:31 crc kubenswrapper[4735]: E1001 10:30:31.329917 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef40c559-9b0d-478a-baa4-239ab6d71d76-cert podName:ef40c559-9b0d-478a-baa4-239ab6d71d76 nodeName:}" failed. No retries permitted until 2025-10-01 10:30:31.829861559 +0000 UTC m=+790.522682821 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ef40c559-9b0d-478a-baa4-239ab6d71d76-cert") pod "openstack-baremetal-operator-controller-manager-77b9676b8cllzwc" (UID: "ef40c559-9b0d-478a-baa4-239ab6d71d76") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.358933 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q9l9\" (UniqueName: \"kubernetes.io/projected/9fba9aa7-25bd-4d48-89e9-818af62e38af-kube-api-access-6q9l9\") pod \"placement-operator-controller-manager-589c58c6c-p8c9g\" (UID: \"9fba9aa7-25bd-4d48-89e9-818af62e38af\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-p8c9g" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.359074 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-m5vc9" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.365810 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-qp9c7" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.368191 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brmdb\" (UniqueName: \"kubernetes.io/projected/50560653-c3f8-4fa8-9d19-a1525b1daaa2-kube-api-access-brmdb\") pod \"swift-operator-controller-manager-84d6b4b759-j6dlz\" (UID: \"50560653-c3f8-4fa8-9d19-a1525b1daaa2\") " pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-j6dlz" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.376457 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6b86d7dbdd-k9n2c"] Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.366486 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qqwn\" (UniqueName: \"kubernetes.io/projected/82c5ecde-4ff5-42bc-9956-45d025d53f45-kube-api-access-7qqwn\") pod \"telemetry-operator-controller-manager-b8d54b5d7-svvjg\" (UID: \"82c5ecde-4ff5-42bc-9956-45d025d53f45\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-svvjg" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.377554 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6b86d7dbdd-k9n2c" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.377698 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-svvjg" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.379033 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.379058 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-5x2qp" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.380069 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jvq8\" (UniqueName: \"kubernetes.io/projected/ef40c559-9b0d-478a-baa4-239ab6d71d76-kube-api-access-7jvq8\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8cllzwc\" (UID: \"ef40c559-9b0d-478a-baa4-239ab6d71d76\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cllzwc" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.386799 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-zsvgx" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.394086 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6tlw\" (UniqueName: \"kubernetes.io/projected/ef5f656f-602f-475c-8bd2-078c0bb43388-kube-api-access-r6tlw\") pod \"ovn-operator-controller-manager-9976ff44c-2pfnh\" (UID: \"ef5f656f-602f-475c-8bd2-078c0bb43388\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-2pfnh" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.395466 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6b86d7dbdd-k9n2c"] Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.409207 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-p8c9g" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.429549 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-k2j2p"] Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.430815 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-k2j2p" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.432468 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-l6wd4" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.435730 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kptf8\" (UniqueName: \"kubernetes.io/projected/dfaa86bf-1e61-4393-ba0f-b9003fdbde80-kube-api-access-kptf8\") pod \"test-operator-controller-manager-85777745bb-5w5cv\" (UID: \"dfaa86bf-1e61-4393-ba0f-b9003fdbde80\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-5w5cv" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.435765 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nstm\" (UniqueName: \"kubernetes.io/projected/c4ac4b8f-7378-438b-8412-1b74d1c3fda9-kube-api-access-8nstm\") pod \"watcher-operator-controller-manager-6b9957f54f-f29js\" (UID: \"c4ac4b8f-7378-438b-8412-1b74d1c3fda9\") " pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-f29js" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.436718 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-9k2wg" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.441032 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-k2j2p"] Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.455989 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-j6dlz" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.466151 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kptf8\" (UniqueName: \"kubernetes.io/projected/dfaa86bf-1e61-4393-ba0f-b9003fdbde80-kube-api-access-kptf8\") pod \"test-operator-controller-manager-85777745bb-5w5cv\" (UID: \"dfaa86bf-1e61-4393-ba0f-b9003fdbde80\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-5w5cv" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.469070 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nstm\" (UniqueName: \"kubernetes.io/projected/c4ac4b8f-7378-438b-8412-1b74d1c3fda9-kube-api-access-8nstm\") pod \"watcher-operator-controller-manager-6b9957f54f-f29js\" (UID: \"c4ac4b8f-7378-438b-8412-1b74d1c3fda9\") " pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-f29js" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.488193 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-mr6vv" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.514372 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-lg9mq" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.537681 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a48cc547-6994-48eb-b8db-9682c091fdac-cert\") pod \"infra-operator-controller-manager-9d6c5db85-txp8d\" (UID: \"a48cc547-6994-48eb-b8db-9682c091fdac\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-txp8d" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.537746 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86607e6f-a912-47f7-b72c-8ea925c5bd53-cert\") pod \"openstack-operator-controller-manager-6b86d7dbdd-k9n2c\" (UID: \"86607e6f-a912-47f7-b72c-8ea925c5bd53\") " pod="openstack-operators/openstack-operator-controller-manager-6b86d7dbdd-k9n2c" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.537783 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2c87\" (UniqueName: \"kubernetes.io/projected/86607e6f-a912-47f7-b72c-8ea925c5bd53-kube-api-access-q2c87\") pod \"openstack-operator-controller-manager-6b86d7dbdd-k9n2c\" (UID: \"86607e6f-a912-47f7-b72c-8ea925c5bd53\") " pod="openstack-operators/openstack-operator-controller-manager-6b86d7dbdd-k9n2c" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.537808 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lflp8\" (UniqueName: \"kubernetes.io/projected/d285f86e-bf4c-4e84-8b59-039754ffb39c-kube-api-access-lflp8\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-k2j2p\" (UID: \"d285f86e-bf4c-4e84-8b59-039754ffb39c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-k2j2p" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.543295 4735 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-v88gl"] Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.553337 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-85777745bb-5w5cv" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.553664 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a48cc547-6994-48eb-b8db-9682c091fdac-cert\") pod \"infra-operator-controller-manager-9d6c5db85-txp8d\" (UID: \"a48cc547-6994-48eb-b8db-9682c091fdac\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-txp8d" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.566257 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-f29js" Oct 01 10:30:31 crc kubenswrapper[4735]: W1001 10:30:31.622053 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27b0c660_ca46_48cb_88ca_bb5715532c80.slice/crio-ed626f3f9803dc191ada4c7bfa4686773d4492b17a62411c31790211a11885e5 WatchSource:0}: Error finding container ed626f3f9803dc191ada4c7bfa4686773d4492b17a62411c31790211a11885e5: Status 404 returned error can't find the container with id ed626f3f9803dc191ada4c7bfa4686773d4492b17a62411c31790211a11885e5 Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.642284 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lflp8\" (UniqueName: \"kubernetes.io/projected/d285f86e-bf4c-4e84-8b59-039754ffb39c-kube-api-access-lflp8\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-k2j2p\" (UID: \"d285f86e-bf4c-4e84-8b59-039754ffb39c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-k2j2p" Oct 01 10:30:31 crc 
kubenswrapper[4735]: I1001 10:30:31.642403 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86607e6f-a912-47f7-b72c-8ea925c5bd53-cert\") pod \"openstack-operator-controller-manager-6b86d7dbdd-k9n2c\" (UID: \"86607e6f-a912-47f7-b72c-8ea925c5bd53\") " pod="openstack-operators/openstack-operator-controller-manager-6b86d7dbdd-k9n2c" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.642433 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2c87\" (UniqueName: \"kubernetes.io/projected/86607e6f-a912-47f7-b72c-8ea925c5bd53-kube-api-access-q2c87\") pod \"openstack-operator-controller-manager-6b86d7dbdd-k9n2c\" (UID: \"86607e6f-a912-47f7-b72c-8ea925c5bd53\") " pod="openstack-operators/openstack-operator-controller-manager-6b86d7dbdd-k9n2c" Oct 01 10:30:31 crc kubenswrapper[4735]: E1001 10:30:31.643245 4735 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 01 10:30:31 crc kubenswrapper[4735]: E1001 10:30:31.643294 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86607e6f-a912-47f7-b72c-8ea925c5bd53-cert podName:86607e6f-a912-47f7-b72c-8ea925c5bd53 nodeName:}" failed. No retries permitted until 2025-10-01 10:30:32.143278804 +0000 UTC m=+790.836100066 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/86607e6f-a912-47f7-b72c-8ea925c5bd53-cert") pod "openstack-operator-controller-manager-6b86d7dbdd-k9n2c" (UID: "86607e6f-a912-47f7-b72c-8ea925c5bd53") : secret "webhook-server-cert" not found Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.652898 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-2pfnh" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.661915 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lflp8\" (UniqueName: \"kubernetes.io/projected/d285f86e-bf4c-4e84-8b59-039754ffb39c-kube-api-access-lflp8\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-k2j2p\" (UID: \"d285f86e-bf4c-4e84-8b59-039754ffb39c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-k2j2p" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.662394 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2c87\" (UniqueName: \"kubernetes.io/projected/86607e6f-a912-47f7-b72c-8ea925c5bd53-kube-api-access-q2c87\") pod \"openstack-operator-controller-manager-6b86d7dbdd-k9n2c\" (UID: \"86607e6f-a912-47f7-b72c-8ea925c5bd53\") " pod="openstack-operators/openstack-operator-controller-manager-6b86d7dbdd-k9n2c" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.718781 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-txp8d" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.736098 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-l5lmn"] Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.845886 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef40c559-9b0d-478a-baa4-239ab6d71d76-cert\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8cllzwc\" (UID: \"ef40c559-9b0d-478a-baa4-239ab6d71d76\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cllzwc" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.855468 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef40c559-9b0d-478a-baa4-239ab6d71d76-cert\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8cllzwc\" (UID: \"ef40c559-9b0d-478a-baa4-239ab6d71d76\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cllzwc" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.892702 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cllzwc" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.908283 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-k2j2p" Oct 01 10:30:31 crc kubenswrapper[4735]: I1001 10:30:31.972531 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-hgmvf"] Oct 01 10:30:32 crc kubenswrapper[4735]: I1001 10:30:32.005432 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-hw4lp"] Oct 01 10:30:32 crc kubenswrapper[4735]: I1001 10:30:32.013793 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-fj5j6"] Oct 01 10:30:32 crc kubenswrapper[4735]: I1001 10:30:32.149800 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86607e6f-a912-47f7-b72c-8ea925c5bd53-cert\") pod \"openstack-operator-controller-manager-6b86d7dbdd-k9n2c\" (UID: \"86607e6f-a912-47f7-b72c-8ea925c5bd53\") " pod="openstack-operators/openstack-operator-controller-manager-6b86d7dbdd-k9n2c" Oct 01 10:30:32 crc kubenswrapper[4735]: I1001 10:30:32.172630 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86607e6f-a912-47f7-b72c-8ea925c5bd53-cert\") pod \"openstack-operator-controller-manager-6b86d7dbdd-k9n2c\" (UID: \"86607e6f-a912-47f7-b72c-8ea925c5bd53\") " pod="openstack-operators/openstack-operator-controller-manager-6b86d7dbdd-k9n2c" Oct 01 10:30:32 crc kubenswrapper[4735]: I1001 10:30:32.181949 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6b86d7dbdd-k9n2c" Oct 01 10:30:32 crc kubenswrapper[4735]: I1001 10:30:32.247594 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-hw4lp" event={"ID":"8a30378d-5948-4a39-b5ec-85b29f5763e9","Type":"ContainerStarted","Data":"e8d71012b1c4f0bb3275891867bd7172aa5b6a4b13f2625f5a288d0ce12ed601"} Oct 01 10:30:32 crc kubenswrapper[4735]: I1001 10:30:32.249924 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-fj5j6" event={"ID":"263d49ed-e577-457b-b887-33f95f1bbed0","Type":"ContainerStarted","Data":"e2f69760b4d98524edc2165b7f55669d0247b134ac881864836ee8db150abd89"} Oct 01 10:30:32 crc kubenswrapper[4735]: I1001 10:30:32.251235 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-l5lmn" event={"ID":"db6d8b12-0a21-40a5-b23a-e943494a2091","Type":"ContainerStarted","Data":"85f646deaf473db80abc3ba117f2c3aff75d0667aab43d57394003b42c4ba667"} Oct 01 10:30:32 crc kubenswrapper[4735]: I1001 10:30:32.252380 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-v88gl" event={"ID":"27b0c660-ca46-48cb-88ca-bb5715532c80","Type":"ContainerStarted","Data":"ed626f3f9803dc191ada4c7bfa4686773d4492b17a62411c31790211a11885e5"} Oct 01 10:30:32 crc kubenswrapper[4735]: I1001 10:30:32.253161 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-hgmvf" event={"ID":"bb3f6d5e-c5f1-40a5-b85e-dc33dd683d98","Type":"ContainerStarted","Data":"a1ff787a96dcdfef2b939832d31a006d04ade230f92346f19053b25a3022d381"} Oct 01 10:30:32 crc kubenswrapper[4735]: I1001 10:30:32.304995 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-wftzg"] Oct 01 10:30:32 crc kubenswrapper[4735]: I1001 10:30:32.317547 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-7hcv5"] Oct 01 10:30:32 crc kubenswrapper[4735]: I1001 10:30:32.440406 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-qp9c7"] Oct 01 10:30:32 crc kubenswrapper[4735]: I1001 10:30:32.443765 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-m5vc9"] Oct 01 10:30:32 crc kubenswrapper[4735]: W1001 10:30:32.447185 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc229f8d8_2a3c_4a42_ae75_f83e7bbc99e9.slice/crio-5ff426fd081d612a08fe676c97f48764162a693f5a45c29c0a9c709ff215a774 WatchSource:0}: Error finding container 5ff426fd081d612a08fe676c97f48764162a693f5a45c29c0a9c709ff215a774: Status 404 returned error can't find the container with id 5ff426fd081d612a08fe676c97f48764162a693f5a45c29c0a9c709ff215a774 Oct 01 10:30:32 crc kubenswrapper[4735]: W1001 10:30:32.450151 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod912f537c_db9b_4256_a5e0_81dc33bcaf3e.slice/crio-58a7a139edfe5024e64608d8381b0905f7a3037b0b68c0f191f130a57775db76 WatchSource:0}: Error finding container 58a7a139edfe5024e64608d8381b0905f7a3037b0b68c0f191f130a57775db76: Status 404 returned error can't find the container with id 58a7a139edfe5024e64608d8381b0905f7a3037b0b68c0f191f130a57775db76 Oct 01 10:30:32 crc kubenswrapper[4735]: I1001 10:30:32.648235 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-svvjg"] Oct 01 10:30:32 crc kubenswrapper[4735]: I1001 
10:30:32.654996 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-zsvgx"] Oct 01 10:30:32 crc kubenswrapper[4735]: I1001 10:30:32.667150 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-9k2wg"] Oct 01 10:30:32 crc kubenswrapper[4735]: I1001 10:30:32.675450 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-p8c9g"] Oct 01 10:30:32 crc kubenswrapper[4735]: I1001 10:30:32.867781 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-5w5cv"] Oct 01 10:30:32 crc kubenswrapper[4735]: I1001 10:30:32.881591 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9957f54f-f29js"] Oct 01 10:30:32 crc kubenswrapper[4735]: I1001 10:30:32.891556 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-txp8d"] Oct 01 10:30:32 crc kubenswrapper[4735]: I1001 10:30:32.895966 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-lg9mq"] Oct 01 10:30:32 crc kubenswrapper[4735]: I1001 10:30:32.918551 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-2pfnh"] Oct 01 10:30:32 crc kubenswrapper[4735]: I1001 10:30:32.929912 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-mr6vv"] Oct 01 10:30:32 crc kubenswrapper[4735]: E1001 10:30:32.943655 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:e1328760310f3bbf4548b8b1268cd711087dd91212b92bb0be287cad1f1b6fe9,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zdhdt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-7b787867f4-lg9mq_openstack-operators(6b1d4e62-3eb6-4090-826d-e627e08c73c6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 10:30:32 crc kubenswrapper[4735]: W1001 10:30:32.969151 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fd0c004_178e_41cb_be27_2d2342d9f58c.slice/crio-0a21d50b7ce4b1be9908775632a53612ebbad8b7f694f5d91031391aeeb245c4 WatchSource:0}: Error finding container 0a21d50b7ce4b1be9908775632a53612ebbad8b7f694f5d91031391aeeb245c4: Status 404 returned error can't find the container with id 0a21d50b7ce4b1be9908775632a53612ebbad8b7f694f5d91031391aeeb245c4 Oct 01 10:30:32 crc kubenswrapper[4735]: I1001 10:30:32.985771 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-j6dlz"] Oct 01 10:30:33 crc kubenswrapper[4735]: E1001 10:30:33.004551 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:a517abc6427ab73fed93b0bd89a6eb52d0311fbfb0c00752f889baf8ffd5068f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fd4rz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-64cd67b5cb-mr6vv_openstack-operators(3fd0c004-178e-41cb-be27-2d2342d9f58c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 10:30:33 crc kubenswrapper[4735]: E1001 10:30:33.004552 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r6tlw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-9976ff44c-2pfnh_openstack-operators(ef5f656f-602f-475c-8bd2-078c0bb43388): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 10:30:33 crc kubenswrapper[4735]: I1001 10:30:33.005870 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cllzwc"] Oct 01 10:30:33 crc kubenswrapper[4735]: W1001 10:30:33.025031 4735 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50560653_c3f8_4fa8_9d19_a1525b1daaa2.slice/crio-987ee2d4d7da66d3a263b3d1b88147869c82fb725f14ee202623b4edfe285e16 WatchSource:0}: Error finding container 987ee2d4d7da66d3a263b3d1b88147869c82fb725f14ee202623b4edfe285e16: Status 404 returned error can't find the container with id 987ee2d4d7da66d3a263b3d1b88147869c82fb725f14ee202623b4edfe285e16 Oct 01 10:30:33 crc kubenswrapper[4735]: I1001 10:30:33.042588 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-k2j2p"] Oct 01 10:30:33 crc kubenswrapper[4735]: I1001 10:30:33.102913 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6b86d7dbdd-k9n2c"] Oct 01 10:30:33 crc kubenswrapper[4735]: E1001 10:30:33.110430 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lflp8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-k2j2p_openstack-operators(d285f86e-bf4c-4e84-8b59-039754ffb39c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 10:30:33 crc kubenswrapper[4735]: E1001 10:30:33.112955 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-k2j2p" podUID="d285f86e-bf4c-4e84-8b59-039754ffb39c" Oct 01 10:30:33 crc kubenswrapper[4735]: E1001 10:30:33.114056 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_D
EFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_I
MAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,}
,EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMA
GE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podifi
ed-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_LIGHTSPEED_IMAGE_URL_DEFAULT,Value:quay.io/openstack-lightspeed/rag-content:os-docs-2024.2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IM
AGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECI
SION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7jvq8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
openstack-baremetal-operator-controller-manager-77b9676b8cllzwc_openstack-operators(ef40c559-9b0d-478a-baa4-239ab6d71d76): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 10:30:33 crc kubenswrapper[4735]: I1001 10:30:33.265009 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-9k2wg" event={"ID":"663344ae-dd00-416c-9120-d4f0721554b4","Type":"ContainerStarted","Data":"df261a2a5079a3c42fd5228f46c65d7d1f2d19293e6c46360a1c0ffcbe550aee"} Oct 01 10:30:33 crc kubenswrapper[4735]: I1001 10:30:33.267055 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-f29js" event={"ID":"c4ac4b8f-7378-438b-8412-1b74d1c3fda9","Type":"ContainerStarted","Data":"6629ba3144568eef1abcc4d266b14e44baed563ee26edfc93ccd54cfbc7ef5bb"} Oct 01 10:30:33 crc kubenswrapper[4735]: I1001 10:30:33.272635 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cllzwc" event={"ID":"ef40c559-9b0d-478a-baa4-239ab6d71d76","Type":"ContainerStarted","Data":"b2fbb8f6066d5296a84d205cc0724d93ff79d1fa226b536287cfd6852a66e053"} Oct 01 10:30:33 crc kubenswrapper[4735]: I1001 10:30:33.273799 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-lg9mq" event={"ID":"6b1d4e62-3eb6-4090-826d-e627e08c73c6","Type":"ContainerStarted","Data":"8060b3f7c62613580d54c10c2d1129dfa1e7c7b1f852bf7b1f1a48b6dc7b3b8b"} Oct 01 10:30:33 crc kubenswrapper[4735]: I1001 10:30:33.279875 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-m5vc9" event={"ID":"912f537c-db9b-4256-a5e0-81dc33bcaf3e","Type":"ContainerStarted","Data":"58a7a139edfe5024e64608d8381b0905f7a3037b0b68c0f191f130a57775db76"} Oct 01 10:30:33 crc kubenswrapper[4735]: I1001 10:30:33.281222 4735 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-5w5cv" event={"ID":"dfaa86bf-1e61-4393-ba0f-b9003fdbde80","Type":"ContainerStarted","Data":"dfda149bc14a45ad8961616db83b5d9ceb4d11bb6e2b993af3dd698545b4c13c"} Oct 01 10:30:33 crc kubenswrapper[4735]: I1001 10:30:33.282303 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-7hcv5" event={"ID":"a1e6fcf2-bfad-48fe-b655-0e9199818230","Type":"ContainerStarted","Data":"731c8036398824b61d359cf6d7e957f4b8349d047c0c9286fc009283de64b5d6"} Oct 01 10:30:33 crc kubenswrapper[4735]: I1001 10:30:33.285136 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-2pfnh" event={"ID":"ef5f656f-602f-475c-8bd2-078c0bb43388","Type":"ContainerStarted","Data":"c5053388502b6cd82ad43cd14b80368e1ee5ea2afc0fc793d974fc15fe045300"} Oct 01 10:30:33 crc kubenswrapper[4735]: I1001 10:30:33.287027 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-p8c9g" event={"ID":"9fba9aa7-25bd-4d48-89e9-818af62e38af","Type":"ContainerStarted","Data":"802853f4465598ef105b51f8bd8d72ab0f942fd5f78b962f87ffaf0a166287ec"} Oct 01 10:30:33 crc kubenswrapper[4735]: I1001 10:30:33.288520 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6b86d7dbdd-k9n2c" event={"ID":"86607e6f-a912-47f7-b72c-8ea925c5bd53","Type":"ContainerStarted","Data":"a5b5cd5567f398a5765d868dbe3b61f10da1ac38b03a3f1d7972c44a57a56080"} Oct 01 10:30:33 crc kubenswrapper[4735]: E1001 10:30:33.288986 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-mr6vv" 
podUID="3fd0c004-178e-41cb-be27-2d2342d9f58c" Oct 01 10:30:33 crc kubenswrapper[4735]: E1001 10:30:33.298916 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-lg9mq" podUID="6b1d4e62-3eb6-4090-826d-e627e08c73c6" Oct 01 10:30:33 crc kubenswrapper[4735]: I1001 10:30:33.301295 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-txp8d" event={"ID":"a48cc547-6994-48eb-b8db-9682c091fdac","Type":"ContainerStarted","Data":"21a3950092662929245616016538a1f96c587d9aff465ba7e67d16b980eda687"} Oct 01 10:30:33 crc kubenswrapper[4735]: I1001 10:30:33.309352 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-j6dlz" event={"ID":"50560653-c3f8-4fa8-9d19-a1525b1daaa2","Type":"ContainerStarted","Data":"987ee2d4d7da66d3a263b3d1b88147869c82fb725f14ee202623b4edfe285e16"} Oct 01 10:30:33 crc kubenswrapper[4735]: E1001 10:30:33.309932 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-2pfnh" podUID="ef5f656f-602f-475c-8bd2-078c0bb43388" Oct 01 10:30:33 crc kubenswrapper[4735]: I1001 10:30:33.317656 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-k2j2p" event={"ID":"d285f86e-bf4c-4e84-8b59-039754ffb39c","Type":"ContainerStarted","Data":"a33a92cba2f6bfb0d7cc7d7b8253c3ac49fb3c56bcb543f6e02691173b0a29d2"} Oct 01 10:30:33 crc kubenswrapper[4735]: E1001 10:30:33.319713 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-k2j2p" podUID="d285f86e-bf4c-4e84-8b59-039754ffb39c" Oct 01 10:30:33 crc kubenswrapper[4735]: I1001 10:30:33.325631 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-zsvgx" event={"ID":"7df184a1-eb46-4e19-85cd-82e8d5da1880","Type":"ContainerStarted","Data":"e0ffa784fe7f1367c82a0c6c46a069ff8d2e1adaeb4bdcbb1b6c922d6b56d814"} Oct 01 10:30:33 crc kubenswrapper[4735]: I1001 10:30:33.328458 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-svvjg" event={"ID":"82c5ecde-4ff5-42bc-9956-45d025d53f45","Type":"ContainerStarted","Data":"f38b6b01f8055f7f6e7c9779e012ade4fdd0f7cb6d34d1a984e263e20a7b2d18"} Oct 01 10:30:33 crc kubenswrapper[4735]: E1001 10:30:33.332460 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cllzwc" podUID="ef40c559-9b0d-478a-baa4-239ab6d71d76" Oct 01 10:30:33 crc kubenswrapper[4735]: I1001 10:30:33.332623 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-qp9c7" event={"ID":"c229f8d8-2a3c-4a42-ae75-f83e7bbc99e9","Type":"ContainerStarted","Data":"5ff426fd081d612a08fe676c97f48764162a693f5a45c29c0a9c709ff215a774"} Oct 01 10:30:33 crc kubenswrapper[4735]: I1001 10:30:33.334874 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-wftzg" 
event={"ID":"628c079b-418b-4299-9394-a59ab2850d23","Type":"ContainerStarted","Data":"73f64047b12df8a91a98d83196133c5c002eb7ee18e76c23ade21b20aeac0ff9"} Oct 01 10:30:33 crc kubenswrapper[4735]: I1001 10:30:33.336539 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-mr6vv" event={"ID":"3fd0c004-178e-41cb-be27-2d2342d9f58c","Type":"ContainerStarted","Data":"0a21d50b7ce4b1be9908775632a53612ebbad8b7f694f5d91031391aeeb245c4"} Oct 01 10:30:33 crc kubenswrapper[4735]: E1001 10:30:33.337772 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:a517abc6427ab73fed93b0bd89a6eb52d0311fbfb0c00752f889baf8ffd5068f\\\"\"" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-mr6vv" podUID="3fd0c004-178e-41cb-be27-2d2342d9f58c" Oct 01 10:30:34 crc kubenswrapper[4735]: I1001 10:30:34.350434 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6b86d7dbdd-k9n2c" event={"ID":"86607e6f-a912-47f7-b72c-8ea925c5bd53","Type":"ContainerStarted","Data":"39f3f44220b29abd99c09884772db3c1ea030023514e3d9e24e512052cf949aa"} Oct 01 10:30:34 crc kubenswrapper[4735]: I1001 10:30:34.350475 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6b86d7dbdd-k9n2c" event={"ID":"86607e6f-a912-47f7-b72c-8ea925c5bd53","Type":"ContainerStarted","Data":"4761430d84d831167bb720c9fd3adf62530b93854ba5cc297edd49758b9cb1e0"} Oct 01 10:30:34 crc kubenswrapper[4735]: I1001 10:30:34.356479 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6b86d7dbdd-k9n2c" Oct 01 10:30:34 crc kubenswrapper[4735]: I1001 10:30:34.357974 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cllzwc" event={"ID":"ef40c559-9b0d-478a-baa4-239ab6d71d76","Type":"ContainerStarted","Data":"e7b488c86eaf3233664506c640a983ef8b2ff75b1275d7f1a0325ee1016a78c2"} Oct 01 10:30:34 crc kubenswrapper[4735]: E1001 10:30:34.360356 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cllzwc" podUID="ef40c559-9b0d-478a-baa4-239ab6d71d76" Oct 01 10:30:34 crc kubenswrapper[4735]: I1001 10:30:34.360773 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-mr6vv" event={"ID":"3fd0c004-178e-41cb-be27-2d2342d9f58c","Type":"ContainerStarted","Data":"5ba8101c05a686a6f2e322846bc7ea6ccc00d61e2c0db9ad3435bed5b789016e"} Oct 01 10:30:34 crc kubenswrapper[4735]: E1001 10:30:34.362718 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:a517abc6427ab73fed93b0bd89a6eb52d0311fbfb0c00752f889baf8ffd5068f\\\"\"" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-mr6vv" podUID="3fd0c004-178e-41cb-be27-2d2342d9f58c" Oct 01 10:30:34 crc kubenswrapper[4735]: I1001 10:30:34.364211 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-lg9mq" event={"ID":"6b1d4e62-3eb6-4090-826d-e627e08c73c6","Type":"ContainerStarted","Data":"674dff3fa99e2c360e9c8f474bf56f4426f2285860d1017ff52f86e7691d9a87"} Oct 01 10:30:34 crc kubenswrapper[4735]: E1001 10:30:34.365252 4735 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:e1328760310f3bbf4548b8b1268cd711087dd91212b92bb0be287cad1f1b6fe9\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-lg9mq" podUID="6b1d4e62-3eb6-4090-826d-e627e08c73c6" Oct 01 10:30:34 crc kubenswrapper[4735]: I1001 10:30:34.366399 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-2pfnh" event={"ID":"ef5f656f-602f-475c-8bd2-078c0bb43388","Type":"ContainerStarted","Data":"aea15146ab49f0ddec341821b49664a8c798378989104de455a89fd7b117cf5a"} Oct 01 10:30:34 crc kubenswrapper[4735]: E1001 10:30:34.367412 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-2pfnh" podUID="ef5f656f-602f-475c-8bd2-078c0bb43388" Oct 01 10:30:34 crc kubenswrapper[4735]: E1001 10:30:34.367816 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-k2j2p" podUID="d285f86e-bf4c-4e84-8b59-039754ffb39c" Oct 01 10:30:34 crc kubenswrapper[4735]: I1001 10:30:34.396245 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6b86d7dbdd-k9n2c" podStartSLOduration=3.396230221 podStartE2EDuration="3.396230221s" podCreationTimestamp="2025-10-01 10:30:31 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:30:34.391106708 +0000 UTC m=+793.083927970" watchObservedRunningTime="2025-10-01 10:30:34.396230221 +0000 UTC m=+793.089051483" Oct 01 10:30:35 crc kubenswrapper[4735]: E1001 10:30:35.377910 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:e1328760310f3bbf4548b8b1268cd711087dd91212b92bb0be287cad1f1b6fe9\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-lg9mq" podUID="6b1d4e62-3eb6-4090-826d-e627e08c73c6" Oct 01 10:30:35 crc kubenswrapper[4735]: E1001 10:30:35.378428 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cllzwc" podUID="ef40c559-9b0d-478a-baa4-239ab6d71d76" Oct 01 10:30:35 crc kubenswrapper[4735]: E1001 10:30:35.380433 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:a517abc6427ab73fed93b0bd89a6eb52d0311fbfb0c00752f889baf8ffd5068f\\\"\"" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-mr6vv" podUID="3fd0c004-178e-41cb-be27-2d2342d9f58c" Oct 01 10:30:35 crc kubenswrapper[4735]: E1001 10:30:35.380657 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-2pfnh" podUID="ef5f656f-602f-475c-8bd2-078c0bb43388" Oct 01 10:30:35 crc kubenswrapper[4735]: I1001 10:30:35.485881 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 10:30:35 crc kubenswrapper[4735]: I1001 10:30:35.485929 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 10:30:35 crc kubenswrapper[4735]: I1001 10:30:35.485973 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" Oct 01 10:30:35 crc kubenswrapper[4735]: I1001 10:30:35.486507 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2af2e089638b91b79b45466ec706a3fb255cc6d4b97f5b95a6d6830b44a807b5"} pod="openshift-machine-config-operator/machine-config-daemon-xgg24" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 10:30:35 crc kubenswrapper[4735]: I1001 10:30:35.486557 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" 
containerID="cri-o://2af2e089638b91b79b45466ec706a3fb255cc6d4b97f5b95a6d6830b44a807b5" gracePeriod=600 Oct 01 10:30:36 crc kubenswrapper[4735]: I1001 10:30:36.381949 4735 generic.go:334] "Generic (PLEG): container finished" podID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerID="2af2e089638b91b79b45466ec706a3fb255cc6d4b97f5b95a6d6830b44a807b5" exitCode=0 Oct 01 10:30:36 crc kubenswrapper[4735]: I1001 10:30:36.382662 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" event={"ID":"8c2fdbf0-2469-4ca0-8624-d63609123cd1","Type":"ContainerDied","Data":"2af2e089638b91b79b45466ec706a3fb255cc6d4b97f5b95a6d6830b44a807b5"} Oct 01 10:30:36 crc kubenswrapper[4735]: I1001 10:30:36.382693 4735 scope.go:117] "RemoveContainer" containerID="218e50335c2b017baada525f436dc7da1909a27a86b76e27c3f9d13a94f70329" Oct 01 10:30:42 crc kubenswrapper[4735]: I1001 10:30:42.187392 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6b86d7dbdd-k9n2c" Oct 01 10:30:44 crc kubenswrapper[4735]: I1001 10:30:44.438568 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-qp9c7" event={"ID":"c229f8d8-2a3c-4a42-ae75-f83e7bbc99e9","Type":"ContainerStarted","Data":"281c7727bceef0834c5fca19e2696777e5854409ce790bc1cf39aa3d03c3caa4"} Oct 01 10:30:44 crc kubenswrapper[4735]: I1001 10:30:44.443291 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-txp8d" event={"ID":"a48cc547-6994-48eb-b8db-9682c091fdac","Type":"ContainerStarted","Data":"d24961fd791fa756bcbe2b13bd7e1585c7f5dcfbec65b4f84b68d0fe10d55183"} Oct 01 10:30:44 crc kubenswrapper[4735]: I1001 10:30:44.446012 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-hgmvf" 
event={"ID":"bb3f6d5e-c5f1-40a5-b85e-dc33dd683d98","Type":"ContainerStarted","Data":"e2d67548783c0d76ff81a225d072361abd0cd40e5a1c0977c7d498c028c3ef13"} Oct 01 10:30:44 crc kubenswrapper[4735]: I1001 10:30:44.447815 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-svvjg" event={"ID":"82c5ecde-4ff5-42bc-9956-45d025d53f45","Type":"ContainerStarted","Data":"15726c6402b5b694699c3e0cafc29983efd4d9ea17e08a4421509340769d08f9"} Oct 01 10:30:44 crc kubenswrapper[4735]: I1001 10:30:44.448954 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-9k2wg" event={"ID":"663344ae-dd00-416c-9120-d4f0721554b4","Type":"ContainerStarted","Data":"375e4c76ca60b99be2a2e6cbb10f4496e4b2bb2873e2b1b760fad24a91e8b43f"} Oct 01 10:30:44 crc kubenswrapper[4735]: I1001 10:30:44.451643 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-f29js" event={"ID":"c4ac4b8f-7378-438b-8412-1b74d1c3fda9","Type":"ContainerStarted","Data":"e22c0c3844491b0446f297fe83f832d781590cb4397a30dde95bce63793f4af3"} Oct 01 10:30:44 crc kubenswrapper[4735]: I1001 10:30:44.458125 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-p8c9g" event={"ID":"9fba9aa7-25bd-4d48-89e9-818af62e38af","Type":"ContainerStarted","Data":"c69e497ffc1c4fa0c20b1aaf25bf38f627c7d6b92e4114e79c3665fce046c852"} Oct 01 10:30:44 crc kubenswrapper[4735]: I1001 10:30:44.459352 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-5w5cv" event={"ID":"dfaa86bf-1e61-4393-ba0f-b9003fdbde80","Type":"ContainerStarted","Data":"905af34952eacdb46344ffbf34b9b13d9e6ce37e8713ce610580e70ac5fccdce"} Oct 01 10:30:44 crc kubenswrapper[4735]: I1001 10:30:44.460637 4735 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-m5vc9" event={"ID":"912f537c-db9b-4256-a5e0-81dc33bcaf3e","Type":"ContainerStarted","Data":"8b176835f9633b1e305062a01bae667f470faaac0b00f6ecd189149e0b45e0fc"} Oct 01 10:30:44 crc kubenswrapper[4735]: I1001 10:30:44.466029 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" event={"ID":"8c2fdbf0-2469-4ca0-8624-d63609123cd1","Type":"ContainerStarted","Data":"1d509f3e1d9829219adbe6f0a296874023b5cdfe25a87df90afeebbd5d68c288"} Oct 01 10:30:44 crc kubenswrapper[4735]: I1001 10:30:44.468169 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-fj5j6" event={"ID":"263d49ed-e577-457b-b887-33f95f1bbed0","Type":"ContainerStarted","Data":"6e63cfb5ef9e90ae138e8e6174173fb903bd43a662034599c1d98c9612c45b2f"} Oct 01 10:30:44 crc kubenswrapper[4735]: I1001 10:30:44.474066 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-wftzg" event={"ID":"628c079b-418b-4299-9394-a59ab2850d23","Type":"ContainerStarted","Data":"58813c47ca7206f8f46af6b82f6ba157637cb2445649cf65abf75b31514f6347"} Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.487638 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-p8c9g" event={"ID":"9fba9aa7-25bd-4d48-89e9-818af62e38af","Type":"ContainerStarted","Data":"fd17fc1619fcde4459e9bc34573cbb01b02eb1de136d76ec1bb8072b80d4bf9c"} Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.488742 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-p8c9g" Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.490566 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-qp9c7" event={"ID":"c229f8d8-2a3c-4a42-ae75-f83e7bbc99e9","Type":"ContainerStarted","Data":"b5fe4415168efc43f4dba300d589b4276f35b7ba07082e06a61b99739569ab0a"} Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.490944 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-qp9c7" Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.498842 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-7hcv5" event={"ID":"a1e6fcf2-bfad-48fe-b655-0e9199818230","Type":"ContainerStarted","Data":"671737b85cd4a458929db9afad1e3101477985f1ad8e4ef95d02833a2ac8c5f8"} Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.498874 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-7hcv5" event={"ID":"a1e6fcf2-bfad-48fe-b655-0e9199818230","Type":"ContainerStarted","Data":"3377ce70015e99ffc7f6f4177ddb6465ec4e9d2c0468663dac9551c81bf16c1d"} Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.498991 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-7hcv5" Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.503311 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-svvjg" event={"ID":"82c5ecde-4ff5-42bc-9956-45d025d53f45","Type":"ContainerStarted","Data":"40db61f3e9e50df5f8201bcaf943103326f1a06996254843e4fc3e0e806fdc27"} Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.503462 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-svvjg" Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.506731 4735 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-j6dlz" event={"ID":"50560653-c3f8-4fa8-9d19-a1525b1daaa2","Type":"ContainerStarted","Data":"7a219f6df6a5f1dd24a65c0a43e45c17cdf026a2363109a30ef6b79a5e8d9735"} Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.507292 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-p8c9g" podStartSLOduration=3.5330586889999998 podStartE2EDuration="14.507281811s" podCreationTimestamp="2025-10-01 10:30:31 +0000 UTC" firstStartedPulling="2025-10-01 10:30:32.709212947 +0000 UTC m=+791.402034209" lastFinishedPulling="2025-10-01 10:30:43.683436069 +0000 UTC m=+802.376257331" observedRunningTime="2025-10-01 10:30:45.50476101 +0000 UTC m=+804.197582272" watchObservedRunningTime="2025-10-01 10:30:45.507281811 +0000 UTC m=+804.200103063" Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.514279 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-m5vc9" event={"ID":"912f537c-db9b-4256-a5e0-81dc33bcaf3e","Type":"ContainerStarted","Data":"7824261d8111f2fa63bb92124697cd1d55c4540f16eff179cfb16e929ac7a53c"} Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.514713 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-88c7-m5vc9" Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.524656 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-txp8d" Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.527861 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-7hcv5" podStartSLOduration=4.165329157 podStartE2EDuration="15.527849361s" podCreationTimestamp="2025-10-01 10:30:30 
+0000 UTC" firstStartedPulling="2025-10-01 10:30:32.326807735 +0000 UTC m=+791.019628997" lastFinishedPulling="2025-10-01 10:30:43.689327939 +0000 UTC m=+802.382149201" observedRunningTime="2025-10-01 10:30:45.525206759 +0000 UTC m=+804.218028011" watchObservedRunningTime="2025-10-01 10:30:45.527849361 +0000 UTC m=+804.220670623" Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.531729 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-v88gl" event={"ID":"27b0c660-ca46-48cb-88ca-bb5715532c80","Type":"ContainerStarted","Data":"2b33c7cf872c460ac7542a82e22bfb07fdda869e2f9ee13d6088da764062a165"} Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.532258 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-v88gl" Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.537828 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-hgmvf" event={"ID":"bb3f6d5e-c5f1-40a5-b85e-dc33dd683d98","Type":"ContainerStarted","Data":"24bb84912461088cad90be39f146df4b8a9e316673f46d183ee4c13192f321b8"} Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.538423 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-hgmvf" Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.540125 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-zsvgx" event={"ID":"7df184a1-eb46-4e19-85cd-82e8d5da1880","Type":"ContainerStarted","Data":"37c94f34a67cf058d175b293ba05f0dcbf940cf82fa3163ec2dea91f80398d8b"} Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.540169 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-zsvgx" 
event={"ID":"7df184a1-eb46-4e19-85cd-82e8d5da1880","Type":"ContainerStarted","Data":"538d65505c8cae2a9a0b56c983d43b5ff5ee6ad869a3b2d7f753c5e83d5fd89d"} Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.540241 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-zsvgx" Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.543168 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-qp9c7" podStartSLOduration=4.309241634 podStartE2EDuration="15.543157767s" podCreationTimestamp="2025-10-01 10:30:30 +0000 UTC" firstStartedPulling="2025-10-01 10:30:32.449648279 +0000 UTC m=+791.142469541" lastFinishedPulling="2025-10-01 10:30:43.683564392 +0000 UTC m=+802.376385674" observedRunningTime="2025-10-01 10:30:45.539253094 +0000 UTC m=+804.232074356" watchObservedRunningTime="2025-10-01 10:30:45.543157767 +0000 UTC m=+804.235979029" Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.548502 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-hw4lp" event={"ID":"8a30378d-5948-4a39-b5ec-85b29f5763e9","Type":"ContainerStarted","Data":"6a7e0143cea1145a0279c83fe071cfc7c720ec6d6f77a4b37635c48178ca45de"} Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.548701 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-hw4lp" Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.553367 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-9k2wg" event={"ID":"663344ae-dd00-416c-9120-d4f0721554b4","Type":"ContainerStarted","Data":"f8e45a58d59a1b57c574c17a9698563c10007735389aa8704e95ecb1ebe61b17"} Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.553894 4735 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-9k2wg" Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.558604 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-f29js" event={"ID":"c4ac4b8f-7378-438b-8412-1b74d1c3fda9","Type":"ContainerStarted","Data":"6c367b4981415734a8da6a9ffa0dfd77591361043860f8290741a5e6f85d131d"} Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.558728 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-f29js" Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.561028 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-5w5cv" event={"ID":"dfaa86bf-1e61-4393-ba0f-b9003fdbde80","Type":"ContainerStarted","Data":"67034e7f042347cae95c9e0be6214a795bd1ad17130f3bf1233c973e2144dd00"} Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.561427 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-85777745bb-5w5cv" Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.563034 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-fj5j6" event={"ID":"263d49ed-e577-457b-b887-33f95f1bbed0","Type":"ContainerStarted","Data":"91be6eb76ea07e5e7bd4fc1e651675b7d8815dcf3b1ab12b0c3f2e641a7657c4"} Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.563394 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-fj5j6" Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.571183 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-wftzg" event={"ID":"628c079b-418b-4299-9394-a59ab2850d23","Type":"ContainerStarted","Data":"6b6ff81c084e39b84265d17a890b9f152ad7581e716c20b2ebec6be3b917b37b"} Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.571827 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-wftzg" Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.575773 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-l5lmn" event={"ID":"db6d8b12-0a21-40a5-b23a-e943494a2091","Type":"ContainerStarted","Data":"05031374d99860ad1c962a817c37a9338197f376e162613f9f99a359e99aaa6e"} Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.575806 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-l5lmn" event={"ID":"db6d8b12-0a21-40a5-b23a-e943494a2091","Type":"ContainerStarted","Data":"64bd368bfc47edb0039215af95ad69c1d056650a798760d7770fce2ddc960707"} Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.575821 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-l5lmn" Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.586384 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-svvjg" podStartSLOduration=3.56619449 podStartE2EDuration="14.586367079s" podCreationTimestamp="2025-10-01 10:30:31 +0000 UTC" firstStartedPulling="2025-10-01 10:30:32.665467013 +0000 UTC m=+791.358288275" lastFinishedPulling="2025-10-01 10:30:43.685639592 +0000 UTC m=+802.378460864" observedRunningTime="2025-10-01 10:30:45.569193159 +0000 UTC m=+804.262014421" watchObservedRunningTime="2025-10-01 10:30:45.586367079 +0000 UTC 
m=+804.279188341" Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.590817 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-9k2wg" podStartSLOduration=4.5868799639999995 podStartE2EDuration="15.590805995s" podCreationTimestamp="2025-10-01 10:30:30 +0000 UTC" firstStartedPulling="2025-10-01 10:30:32.68461653 +0000 UTC m=+791.377437782" lastFinishedPulling="2025-10-01 10:30:43.688542551 +0000 UTC m=+802.381363813" observedRunningTime="2025-10-01 10:30:45.58601 +0000 UTC m=+804.278831262" watchObservedRunningTime="2025-10-01 10:30:45.590805995 +0000 UTC m=+804.283627257" Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.604278 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-zsvgx" podStartSLOduration=4.577059879 podStartE2EDuration="15.604262336s" podCreationTimestamp="2025-10-01 10:30:30 +0000 UTC" firstStartedPulling="2025-10-01 10:30:32.658581268 +0000 UTC m=+791.351402540" lastFinishedPulling="2025-10-01 10:30:43.685783745 +0000 UTC m=+802.378604997" observedRunningTime="2025-10-01 10:30:45.599863691 +0000 UTC m=+804.292684953" watchObservedRunningTime="2025-10-01 10:30:45.604262336 +0000 UTC m=+804.297083598" Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.628457 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-txp8d" podStartSLOduration=4.862137155 podStartE2EDuration="15.628442533s" podCreationTimestamp="2025-10-01 10:30:30 +0000 UTC" firstStartedPulling="2025-10-01 10:30:32.921929536 +0000 UTC m=+791.614750798" lastFinishedPulling="2025-10-01 10:30:43.688234914 +0000 UTC m=+802.381056176" observedRunningTime="2025-10-01 10:30:45.625905993 +0000 UTC m=+804.318727255" watchObservedRunningTime="2025-10-01 10:30:45.628442533 +0000 UTC m=+804.321263795" Oct 01 
10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.678031 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-fj5j6" podStartSLOduration=4.071087118 podStartE2EDuration="15.678012928s" podCreationTimestamp="2025-10-01 10:30:30 +0000 UTC" firstStartedPulling="2025-10-01 10:30:32.078115857 +0000 UTC m=+790.770937109" lastFinishedPulling="2025-10-01 10:30:43.685041647 +0000 UTC m=+802.377862919" observedRunningTime="2025-10-01 10:30:45.657805775 +0000 UTC m=+804.350627047" watchObservedRunningTime="2025-10-01 10:30:45.678012928 +0000 UTC m=+804.370834190" Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.711408 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-85777745bb-5w5cv" podStartSLOduration=3.9452229 podStartE2EDuration="14.711393544s" podCreationTimestamp="2025-10-01 10:30:31 +0000 UTC" firstStartedPulling="2025-10-01 10:30:32.921633199 +0000 UTC m=+791.614454461" lastFinishedPulling="2025-10-01 10:30:43.687803833 +0000 UTC m=+802.380625105" observedRunningTime="2025-10-01 10:30:45.709186162 +0000 UTC m=+804.402007424" watchObservedRunningTime="2025-10-01 10:30:45.711393544 +0000 UTC m=+804.404214806" Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.726208 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-hw4lp" podStartSLOduration=4.115851586 podStartE2EDuration="15.726192608s" podCreationTimestamp="2025-10-01 10:30:30 +0000 UTC" firstStartedPulling="2025-10-01 10:30:32.078116877 +0000 UTC m=+790.770938139" lastFinishedPulling="2025-10-01 10:30:43.688457889 +0000 UTC m=+802.381279161" observedRunningTime="2025-10-01 10:30:45.723749459 +0000 UTC m=+804.416570721" watchObservedRunningTime="2025-10-01 10:30:45.726192608 +0000 UTC m=+804.419013870" Oct 01 10:30:45 crc 
kubenswrapper[4735]: I1001 10:30:45.742419 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-f29js" podStartSLOduration=3.964024319 podStartE2EDuration="14.742401645s" podCreationTimestamp="2025-10-01 10:30:31 +0000 UTC" firstStartedPulling="2025-10-01 10:30:32.909853428 +0000 UTC m=+791.602674690" lastFinishedPulling="2025-10-01 10:30:43.688230754 +0000 UTC m=+802.381052016" observedRunningTime="2025-10-01 10:30:45.738523742 +0000 UTC m=+804.431345004" watchObservedRunningTime="2025-10-01 10:30:45.742401645 +0000 UTC m=+804.435222907" Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.771805 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-v88gl" podStartSLOduration=3.710053067 podStartE2EDuration="15.771788487s" podCreationTimestamp="2025-10-01 10:30:30 +0000 UTC" firstStartedPulling="2025-10-01 10:30:31.630522749 +0000 UTC m=+790.323344011" lastFinishedPulling="2025-10-01 10:30:43.692258169 +0000 UTC m=+802.385079431" observedRunningTime="2025-10-01 10:30:45.767080564 +0000 UTC m=+804.459901826" watchObservedRunningTime="2025-10-01 10:30:45.771788487 +0000 UTC m=+804.464609739" Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.787444 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-hgmvf" podStartSLOduration=4.150142385 podStartE2EDuration="15.7874291s" podCreationTimestamp="2025-10-01 10:30:30 +0000 UTC" firstStartedPulling="2025-10-01 10:30:32.047463575 +0000 UTC m=+790.740284837" lastFinishedPulling="2025-10-01 10:30:43.68475029 +0000 UTC m=+802.377571552" observedRunningTime="2025-10-01 10:30:45.785533354 +0000 UTC m=+804.478354616" watchObservedRunningTime="2025-10-01 10:30:45.7874291 +0000 UTC m=+804.480250362" Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 
10:30:45.817902 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-88c7-m5vc9" podStartSLOduration=4.585938272 podStartE2EDuration="15.817886898s" podCreationTimestamp="2025-10-01 10:30:30 +0000 UTC" firstStartedPulling="2025-10-01 10:30:32.451443382 +0000 UTC m=+791.144264644" lastFinishedPulling="2025-10-01 10:30:43.683391998 +0000 UTC m=+802.376213270" observedRunningTime="2025-10-01 10:30:45.813719378 +0000 UTC m=+804.506540640" watchObservedRunningTime="2025-10-01 10:30:45.817886898 +0000 UTC m=+804.510708160" Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.836797 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-l5lmn" podStartSLOduration=3.966534439 podStartE2EDuration="15.836782058s" podCreationTimestamp="2025-10-01 10:30:30 +0000 UTC" firstStartedPulling="2025-10-01 10:30:31.81863077 +0000 UTC m=+790.511452032" lastFinishedPulling="2025-10-01 10:30:43.688878389 +0000 UTC m=+802.381699651" observedRunningTime="2025-10-01 10:30:45.830376165 +0000 UTC m=+804.523197427" watchObservedRunningTime="2025-10-01 10:30:45.836782058 +0000 UTC m=+804.529603320" Oct 01 10:30:45 crc kubenswrapper[4735]: I1001 10:30:45.858258 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-wftzg" podStartSLOduration=4.481473387 podStartE2EDuration="15.858243931s" podCreationTimestamp="2025-10-01 10:30:30 +0000 UTC" firstStartedPulling="2025-10-01 10:30:32.312587886 +0000 UTC m=+791.005409158" lastFinishedPulling="2025-10-01 10:30:43.68935844 +0000 UTC m=+802.382179702" observedRunningTime="2025-10-01 10:30:45.85318599 +0000 UTC m=+804.546007252" watchObservedRunningTime="2025-10-01 10:30:45.858243931 +0000 UTC m=+804.551065193" Oct 01 10:30:46 crc kubenswrapper[4735]: I1001 10:30:46.582228 4735 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-hw4lp" event={"ID":"8a30378d-5948-4a39-b5ec-85b29f5763e9","Type":"ContainerStarted","Data":"a4abe5a355bc833c73443e8a0b346b426b847b50760b39d8794ac0fdc1f07b90"} Oct 01 10:30:46 crc kubenswrapper[4735]: I1001 10:30:46.583767 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-j6dlz" event={"ID":"50560653-c3f8-4fa8-9d19-a1525b1daaa2","Type":"ContainerStarted","Data":"9f6544a262dc8f64be07c15df593687845149c83672fbae5cdf9a76c336b3e5c"} Oct 01 10:30:46 crc kubenswrapper[4735]: I1001 10:30:46.583849 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-j6dlz" Oct 01 10:30:46 crc kubenswrapper[4735]: I1001 10:30:46.586153 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-txp8d" event={"ID":"a48cc547-6994-48eb-b8db-9682c091fdac","Type":"ContainerStarted","Data":"b80df033a3c428ab0a7a86282480931d96a7babed49b072bc222f33218282fc6"} Oct 01 10:30:46 crc kubenswrapper[4735]: I1001 10:30:46.588316 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-v88gl" event={"ID":"27b0c660-ca46-48cb-88ca-bb5715532c80","Type":"ContainerStarted","Data":"ac8ba72eba4ae96ed66690328f408ea00988d87a80b66026d8361975c2a2a063"} Oct 01 10:30:46 crc kubenswrapper[4735]: I1001 10:30:46.613110 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-j6dlz" podStartSLOduration=5.010815286 podStartE2EDuration="15.613077405s" podCreationTimestamp="2025-10-01 10:30:31 +0000 UTC" firstStartedPulling="2025-10-01 10:30:33.08663738 +0000 UTC m=+791.779458632" lastFinishedPulling="2025-10-01 10:30:43.688899489 +0000 UTC 
m=+802.381720751" observedRunningTime="2025-10-01 10:30:46.603743253 +0000 UTC m=+805.296564515" watchObservedRunningTime="2025-10-01 10:30:46.613077405 +0000 UTC m=+805.305898717" Oct 01 10:30:50 crc kubenswrapper[4735]: I1001 10:30:50.925659 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-v88gl" Oct 01 10:30:50 crc kubenswrapper[4735]: I1001 10:30:50.960316 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-l5lmn" Oct 01 10:30:51 crc kubenswrapper[4735]: I1001 10:30:51.047176 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-hgmvf" Oct 01 10:30:51 crc kubenswrapper[4735]: I1001 10:30:51.051901 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-fj5j6" Oct 01 10:30:51 crc kubenswrapper[4735]: I1001 10:30:51.193155 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-hw4lp" Oct 01 10:30:51 crc kubenswrapper[4735]: I1001 10:30:51.224271 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-7hcv5" Oct 01 10:30:51 crc kubenswrapper[4735]: I1001 10:30:51.252706 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-wftzg" Oct 01 10:30:51 crc kubenswrapper[4735]: I1001 10:30:51.361935 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-88c7-m5vc9" Oct 01 10:30:51 crc kubenswrapper[4735]: I1001 10:30:51.368582 4735 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-qp9c7" Oct 01 10:30:51 crc kubenswrapper[4735]: I1001 10:30:51.383234 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-svvjg" Oct 01 10:30:51 crc kubenswrapper[4735]: I1001 10:30:51.389874 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-zsvgx" Oct 01 10:30:51 crc kubenswrapper[4735]: I1001 10:30:51.416873 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-p8c9g" Oct 01 10:30:51 crc kubenswrapper[4735]: I1001 10:30:51.447036 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-9k2wg" Oct 01 10:30:51 crc kubenswrapper[4735]: I1001 10:30:51.461391 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-j6dlz" Oct 01 10:30:51 crc kubenswrapper[4735]: I1001 10:30:51.556185 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-85777745bb-5w5cv" Oct 01 10:30:51 crc kubenswrapper[4735]: I1001 10:30:51.573739 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-f29js" Oct 01 10:30:51 crc kubenswrapper[4735]: I1001 10:30:51.726988 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-txp8d" Oct 01 10:30:59 crc kubenswrapper[4735]: I1001 10:30:59.698730 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-2pfnh" event={"ID":"ef5f656f-602f-475c-8bd2-078c0bb43388","Type":"ContainerStarted","Data":"1541bb321a2494332f73b402e8cb9085a2f68c4f5555861c2b0e9094d2e0691a"} Oct 01 10:30:59 crc kubenswrapper[4735]: I1001 10:30:59.699507 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-2pfnh" Oct 01 10:30:59 crc kubenswrapper[4735]: I1001 10:30:59.701040 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cllzwc" event={"ID":"ef40c559-9b0d-478a-baa4-239ab6d71d76","Type":"ContainerStarted","Data":"1dc412d7e12b3a0686b6e779c227f697353a034729787f24c08793d3251f9357"} Oct 01 10:30:59 crc kubenswrapper[4735]: I1001 10:30:59.701239 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cllzwc" Oct 01 10:30:59 crc kubenswrapper[4735]: I1001 10:30:59.703884 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-mr6vv" event={"ID":"3fd0c004-178e-41cb-be27-2d2342d9f58c","Type":"ContainerStarted","Data":"0eeae82f1636e60558078b68875fe4352222dc9ca59a30544a9654467cedf14e"} Oct 01 10:30:59 crc kubenswrapper[4735]: I1001 10:30:59.704056 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-mr6vv" Oct 01 10:30:59 crc kubenswrapper[4735]: I1001 10:30:59.705872 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-k2j2p" event={"ID":"d285f86e-bf4c-4e84-8b59-039754ffb39c","Type":"ContainerStarted","Data":"69f0f0873b183545667589344bec8c5a2d70635435ddf9dd86a8edd4a3f913fc"} Oct 01 10:30:59 crc kubenswrapper[4735]: I1001 10:30:59.707961 4735 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-lg9mq" event={"ID":"6b1d4e62-3eb6-4090-826d-e627e08c73c6","Type":"ContainerStarted","Data":"4f4b2b408ba5942c23b224a257d1cfbd0a43c48ae5f190aff21665c03a5be34c"} Oct 01 10:30:59 crc kubenswrapper[4735]: I1001 10:30:59.708195 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-lg9mq" Oct 01 10:30:59 crc kubenswrapper[4735]: I1001 10:30:59.718401 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-2pfnh" podStartSLOduration=3.979581071 podStartE2EDuration="29.718383494s" podCreationTimestamp="2025-10-01 10:30:30 +0000 UTC" firstStartedPulling="2025-10-01 10:30:33.004384975 +0000 UTC m=+791.697206237" lastFinishedPulling="2025-10-01 10:30:58.743187368 +0000 UTC m=+817.436008660" observedRunningTime="2025-10-01 10:30:59.716578762 +0000 UTC m=+818.409400024" watchObservedRunningTime="2025-10-01 10:30:59.718383494 +0000 UTC m=+818.411204766" Oct 01 10:30:59 crc kubenswrapper[4735]: I1001 10:30:59.736482 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-mr6vv" podStartSLOduration=3.962800191 podStartE2EDuration="29.736456676s" podCreationTimestamp="2025-10-01 10:30:30 +0000 UTC" firstStartedPulling="2025-10-01 10:30:33.004392436 +0000 UTC m=+791.697213698" lastFinishedPulling="2025-10-01 10:30:58.778048881 +0000 UTC m=+817.470870183" observedRunningTime="2025-10-01 10:30:59.731930688 +0000 UTC m=+818.424751950" watchObservedRunningTime="2025-10-01 10:30:59.736456676 +0000 UTC m=+818.429277958" Oct 01 10:30:59 crc kubenswrapper[4735]: I1001 10:30:59.761085 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cllzwc" podStartSLOduration=4.116238154 podStartE2EDuration="29.761067904s" podCreationTimestamp="2025-10-01 10:30:30 +0000 UTC" firstStartedPulling="2025-10-01 10:30:33.098360929 +0000 UTC m=+791.791182191" lastFinishedPulling="2025-10-01 10:30:58.743190659 +0000 UTC m=+817.436011941" observedRunningTime="2025-10-01 10:30:59.760918561 +0000 UTC m=+818.453739843" watchObservedRunningTime="2025-10-01 10:30:59.761067904 +0000 UTC m=+818.453889166" Oct 01 10:30:59 crc kubenswrapper[4735]: I1001 10:30:59.781176 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-lg9mq" podStartSLOduration=3.915144033 podStartE2EDuration="29.781155603s" podCreationTimestamp="2025-10-01 10:30:30 +0000 UTC" firstStartedPulling="2025-10-01 10:30:32.943547053 +0000 UTC m=+791.636368315" lastFinishedPulling="2025-10-01 10:30:58.809558603 +0000 UTC m=+817.502379885" observedRunningTime="2025-10-01 10:30:59.774565747 +0000 UTC m=+818.467387009" watchObservedRunningTime="2025-10-01 10:30:59.781155603 +0000 UTC m=+818.473976875" Oct 01 10:30:59 crc kubenswrapper[4735]: I1001 10:30:59.792144 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-k2j2p" podStartSLOduration=3.881274334 podStartE2EDuration="28.792123526s" podCreationTimestamp="2025-10-01 10:30:31 +0000 UTC" firstStartedPulling="2025-10-01 10:30:33.110284715 +0000 UTC m=+791.803105977" lastFinishedPulling="2025-10-01 10:30:58.021133907 +0000 UTC m=+816.713955169" observedRunningTime="2025-10-01 10:30:59.789418211 +0000 UTC m=+818.482239473" watchObservedRunningTime="2025-10-01 10:30:59.792123526 +0000 UTC m=+818.484944788" Oct 01 10:31:11 crc kubenswrapper[4735]: I1001 10:31:11.496922 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-mr6vv" Oct 01 10:31:11 crc kubenswrapper[4735]: I1001 10:31:11.518393 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-lg9mq" Oct 01 10:31:11 crc kubenswrapper[4735]: I1001 10:31:11.656201 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-2pfnh" Oct 01 10:31:11 crc kubenswrapper[4735]: I1001 10:31:11.911350 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cllzwc" Oct 01 10:31:32 crc kubenswrapper[4735]: I1001 10:31:32.091050 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-z6pb7"] Oct 01 10:31:32 crc kubenswrapper[4735]: I1001 10:31:32.093025 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-z6pb7" Oct 01 10:31:32 crc kubenswrapper[4735]: W1001 10:31:32.096652 4735 reflector.go:561] object-"openstack"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Oct 01 10:31:32 crc kubenswrapper[4735]: W1001 10:31:32.096700 4735 reflector.go:561] object-"openstack"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Oct 01 10:31:32 crc kubenswrapper[4735]: E1001 10:31:32.096709 4735 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 01 10:31:32 crc kubenswrapper[4735]: E1001 10:31:32.096775 4735 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 01 10:31:32 crc kubenswrapper[4735]: I1001 10:31:32.096726 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 01 10:31:32 crc kubenswrapper[4735]: I1001 10:31:32.096935 4735 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"dnsmasq-dns-dockercfg-m8v87" Oct 01 10:31:32 crc kubenswrapper[4735]: I1001 10:31:32.131143 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-z6pb7"] Oct 01 10:31:32 crc kubenswrapper[4735]: I1001 10:31:32.180753 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-xlnkq"] Oct 01 10:31:32 crc kubenswrapper[4735]: I1001 10:31:32.182255 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-xlnkq" Oct 01 10:31:32 crc kubenswrapper[4735]: I1001 10:31:32.186278 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-xlnkq"] Oct 01 10:31:32 crc kubenswrapper[4735]: I1001 10:31:32.186467 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 01 10:31:32 crc kubenswrapper[4735]: I1001 10:31:32.261715 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c545ee1a-4d5c-4f0d-a44c-723493610cc7-config\") pod \"dnsmasq-dns-675f4bcbfc-z6pb7\" (UID: \"c545ee1a-4d5c-4f0d-a44c-723493610cc7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-z6pb7" Oct 01 10:31:32 crc kubenswrapper[4735]: I1001 10:31:32.261924 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb8xl\" (UniqueName: \"kubernetes.io/projected/c545ee1a-4d5c-4f0d-a44c-723493610cc7-kube-api-access-lb8xl\") pod \"dnsmasq-dns-675f4bcbfc-z6pb7\" (UID: \"c545ee1a-4d5c-4f0d-a44c-723493610cc7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-z6pb7" Oct 01 10:31:32 crc kubenswrapper[4735]: I1001 10:31:32.363318 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm8pc\" (UniqueName: \"kubernetes.io/projected/e382d805-f31c-4226-a51d-1e464b9d613c-kube-api-access-sm8pc\") pod 
\"dnsmasq-dns-78dd6ddcc-xlnkq\" (UID: \"e382d805-f31c-4226-a51d-1e464b9d613c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xlnkq" Oct 01 10:31:32 crc kubenswrapper[4735]: I1001 10:31:32.363550 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb8xl\" (UniqueName: \"kubernetes.io/projected/c545ee1a-4d5c-4f0d-a44c-723493610cc7-kube-api-access-lb8xl\") pod \"dnsmasq-dns-675f4bcbfc-z6pb7\" (UID: \"c545ee1a-4d5c-4f0d-a44c-723493610cc7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-z6pb7" Oct 01 10:31:32 crc kubenswrapper[4735]: I1001 10:31:32.363860 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e382d805-f31c-4226-a51d-1e464b9d613c-config\") pod \"dnsmasq-dns-78dd6ddcc-xlnkq\" (UID: \"e382d805-f31c-4226-a51d-1e464b9d613c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xlnkq" Oct 01 10:31:32 crc kubenswrapper[4735]: I1001 10:31:32.363974 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c545ee1a-4d5c-4f0d-a44c-723493610cc7-config\") pod \"dnsmasq-dns-675f4bcbfc-z6pb7\" (UID: \"c545ee1a-4d5c-4f0d-a44c-723493610cc7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-z6pb7" Oct 01 10:31:32 crc kubenswrapper[4735]: I1001 10:31:32.364036 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e382d805-f31c-4226-a51d-1e464b9d613c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-xlnkq\" (UID: \"e382d805-f31c-4226-a51d-1e464b9d613c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xlnkq" Oct 01 10:31:32 crc kubenswrapper[4735]: I1001 10:31:32.365160 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c545ee1a-4d5c-4f0d-a44c-723493610cc7-config\") pod \"dnsmasq-dns-675f4bcbfc-z6pb7\" (UID: 
\"c545ee1a-4d5c-4f0d-a44c-723493610cc7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-z6pb7" Oct 01 10:31:32 crc kubenswrapper[4735]: I1001 10:31:32.464935 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e382d805-f31c-4226-a51d-1e464b9d613c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-xlnkq\" (UID: \"e382d805-f31c-4226-a51d-1e464b9d613c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xlnkq" Oct 01 10:31:32 crc kubenswrapper[4735]: I1001 10:31:32.465005 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm8pc\" (UniqueName: \"kubernetes.io/projected/e382d805-f31c-4226-a51d-1e464b9d613c-kube-api-access-sm8pc\") pod \"dnsmasq-dns-78dd6ddcc-xlnkq\" (UID: \"e382d805-f31c-4226-a51d-1e464b9d613c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xlnkq" Oct 01 10:31:32 crc kubenswrapper[4735]: I1001 10:31:32.465077 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e382d805-f31c-4226-a51d-1e464b9d613c-config\") pod \"dnsmasq-dns-78dd6ddcc-xlnkq\" (UID: \"e382d805-f31c-4226-a51d-1e464b9d613c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xlnkq" Oct 01 10:31:32 crc kubenswrapper[4735]: I1001 10:31:32.465780 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e382d805-f31c-4226-a51d-1e464b9d613c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-xlnkq\" (UID: \"e382d805-f31c-4226-a51d-1e464b9d613c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xlnkq" Oct 01 10:31:32 crc kubenswrapper[4735]: I1001 10:31:32.465820 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e382d805-f31c-4226-a51d-1e464b9d613c-config\") pod \"dnsmasq-dns-78dd6ddcc-xlnkq\" (UID: \"e382d805-f31c-4226-a51d-1e464b9d613c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xlnkq" Oct 01 10:31:33 crc 
kubenswrapper[4735]: I1001 10:31:33.247658 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 01 10:31:33 crc kubenswrapper[4735]: I1001 10:31:33.312862 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 01 10:31:33 crc kubenswrapper[4735]: I1001 10:31:33.322330 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm8pc\" (UniqueName: \"kubernetes.io/projected/e382d805-f31c-4226-a51d-1e464b9d613c-kube-api-access-sm8pc\") pod \"dnsmasq-dns-78dd6ddcc-xlnkq\" (UID: \"e382d805-f31c-4226-a51d-1e464b9d613c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xlnkq" Oct 01 10:31:33 crc kubenswrapper[4735]: I1001 10:31:33.322474 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb8xl\" (UniqueName: \"kubernetes.io/projected/c545ee1a-4d5c-4f0d-a44c-723493610cc7-kube-api-access-lb8xl\") pod \"dnsmasq-dns-675f4bcbfc-z6pb7\" (UID: \"c545ee1a-4d5c-4f0d-a44c-723493610cc7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-z6pb7" Oct 01 10:31:33 crc kubenswrapper[4735]: I1001 10:31:33.332566 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-z6pb7" Oct 01 10:31:33 crc kubenswrapper[4735]: I1001 10:31:33.415972 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-xlnkq" Oct 01 10:31:33 crc kubenswrapper[4735]: I1001 10:31:33.730033 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-z6pb7"] Oct 01 10:31:33 crc kubenswrapper[4735]: I1001 10:31:33.735825 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 10:31:33 crc kubenswrapper[4735]: I1001 10:31:33.820855 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-xlnkq"] Oct 01 10:31:33 crc kubenswrapper[4735]: W1001 10:31:33.826440 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode382d805_f31c_4226_a51d_1e464b9d613c.slice/crio-2a32d58570f9124f5774dd97e8eb2b7d883dc3f0fb83ad39f2fd528c5817f38c WatchSource:0}: Error finding container 2a32d58570f9124f5774dd97e8eb2b7d883dc3f0fb83ad39f2fd528c5817f38c: Status 404 returned error can't find the container with id 2a32d58570f9124f5774dd97e8eb2b7d883dc3f0fb83ad39f2fd528c5817f38c Oct 01 10:31:33 crc kubenswrapper[4735]: I1001 10:31:33.995334 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-xlnkq" event={"ID":"e382d805-f31c-4226-a51d-1e464b9d613c","Type":"ContainerStarted","Data":"2a32d58570f9124f5774dd97e8eb2b7d883dc3f0fb83ad39f2fd528c5817f38c"} Oct 01 10:31:33 crc kubenswrapper[4735]: I1001 10:31:33.996194 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-z6pb7" event={"ID":"c545ee1a-4d5c-4f0d-a44c-723493610cc7","Type":"ContainerStarted","Data":"9cea45d165714687ed4483b9b377d2036ae50ce172bf2658bf842dc79b56d0d6"} Oct 01 10:31:34 crc kubenswrapper[4735]: I1001 10:31:34.960840 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-z6pb7"] Oct 01 10:31:34 crc kubenswrapper[4735]: I1001 10:31:34.981364 4735 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-666b6646f7-ldhfc"] Oct 01 10:31:34 crc kubenswrapper[4735]: I1001 10:31:34.983813 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-ldhfc" Oct 01 10:31:34 crc kubenswrapper[4735]: I1001 10:31:34.996146 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzq46\" (UniqueName: \"kubernetes.io/projected/eae96f88-b581-42fd-b127-bd1e94d4f977-kube-api-access-xzq46\") pod \"dnsmasq-dns-666b6646f7-ldhfc\" (UID: \"eae96f88-b581-42fd-b127-bd1e94d4f977\") " pod="openstack/dnsmasq-dns-666b6646f7-ldhfc" Oct 01 10:31:34 crc kubenswrapper[4735]: I1001 10:31:34.996271 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eae96f88-b581-42fd-b127-bd1e94d4f977-config\") pod \"dnsmasq-dns-666b6646f7-ldhfc\" (UID: \"eae96f88-b581-42fd-b127-bd1e94d4f977\") " pod="openstack/dnsmasq-dns-666b6646f7-ldhfc" Oct 01 10:31:34 crc kubenswrapper[4735]: I1001 10:31:34.996314 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eae96f88-b581-42fd-b127-bd1e94d4f977-dns-svc\") pod \"dnsmasq-dns-666b6646f7-ldhfc\" (UID: \"eae96f88-b581-42fd-b127-bd1e94d4f977\") " pod="openstack/dnsmasq-dns-666b6646f7-ldhfc" Oct 01 10:31:35 crc kubenswrapper[4735]: I1001 10:31:35.004047 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-ldhfc"] Oct 01 10:31:35 crc kubenswrapper[4735]: I1001 10:31:35.100487 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzq46\" (UniqueName: \"kubernetes.io/projected/eae96f88-b581-42fd-b127-bd1e94d4f977-kube-api-access-xzq46\") pod \"dnsmasq-dns-666b6646f7-ldhfc\" (UID: \"eae96f88-b581-42fd-b127-bd1e94d4f977\") " 
pod="openstack/dnsmasq-dns-666b6646f7-ldhfc" Oct 01 10:31:35 crc kubenswrapper[4735]: I1001 10:31:35.100595 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eae96f88-b581-42fd-b127-bd1e94d4f977-config\") pod \"dnsmasq-dns-666b6646f7-ldhfc\" (UID: \"eae96f88-b581-42fd-b127-bd1e94d4f977\") " pod="openstack/dnsmasq-dns-666b6646f7-ldhfc" Oct 01 10:31:35 crc kubenswrapper[4735]: I1001 10:31:35.101138 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eae96f88-b581-42fd-b127-bd1e94d4f977-dns-svc\") pod \"dnsmasq-dns-666b6646f7-ldhfc\" (UID: \"eae96f88-b581-42fd-b127-bd1e94d4f977\") " pod="openstack/dnsmasq-dns-666b6646f7-ldhfc" Oct 01 10:31:35 crc kubenswrapper[4735]: I1001 10:31:35.102804 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eae96f88-b581-42fd-b127-bd1e94d4f977-dns-svc\") pod \"dnsmasq-dns-666b6646f7-ldhfc\" (UID: \"eae96f88-b581-42fd-b127-bd1e94d4f977\") " pod="openstack/dnsmasq-dns-666b6646f7-ldhfc" Oct 01 10:31:35 crc kubenswrapper[4735]: I1001 10:31:35.105379 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eae96f88-b581-42fd-b127-bd1e94d4f977-config\") pod \"dnsmasq-dns-666b6646f7-ldhfc\" (UID: \"eae96f88-b581-42fd-b127-bd1e94d4f977\") " pod="openstack/dnsmasq-dns-666b6646f7-ldhfc" Oct 01 10:31:35 crc kubenswrapper[4735]: I1001 10:31:35.130724 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzq46\" (UniqueName: \"kubernetes.io/projected/eae96f88-b581-42fd-b127-bd1e94d4f977-kube-api-access-xzq46\") pod \"dnsmasq-dns-666b6646f7-ldhfc\" (UID: \"eae96f88-b581-42fd-b127-bd1e94d4f977\") " pod="openstack/dnsmasq-dns-666b6646f7-ldhfc" Oct 01 10:31:35 crc kubenswrapper[4735]: I1001 10:31:35.281617 4735 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-xlnkq"] Oct 01 10:31:35 crc kubenswrapper[4735]: I1001 10:31:35.305987 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-ldhfc" Oct 01 10:31:35 crc kubenswrapper[4735]: I1001 10:31:35.306520 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gdsfg"] Oct 01 10:31:35 crc kubenswrapper[4735]: I1001 10:31:35.307883 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gdsfg" Oct 01 10:31:35 crc kubenswrapper[4735]: I1001 10:31:35.310221 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gdsfg"] Oct 01 10:31:35 crc kubenswrapper[4735]: I1001 10:31:35.506467 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc112b48-aa35-4a2a-a496-ac720211a123-config\") pod \"dnsmasq-dns-57d769cc4f-gdsfg\" (UID: \"dc112b48-aa35-4a2a-a496-ac720211a123\") " pod="openstack/dnsmasq-dns-57d769cc4f-gdsfg" Oct 01 10:31:35 crc kubenswrapper[4735]: I1001 10:31:35.506706 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc112b48-aa35-4a2a-a496-ac720211a123-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-gdsfg\" (UID: \"dc112b48-aa35-4a2a-a496-ac720211a123\") " pod="openstack/dnsmasq-dns-57d769cc4f-gdsfg" Oct 01 10:31:35 crc kubenswrapper[4735]: I1001 10:31:35.506861 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzj6q\" (UniqueName: \"kubernetes.io/projected/dc112b48-aa35-4a2a-a496-ac720211a123-kube-api-access-gzj6q\") pod \"dnsmasq-dns-57d769cc4f-gdsfg\" (UID: \"dc112b48-aa35-4a2a-a496-ac720211a123\") " pod="openstack/dnsmasq-dns-57d769cc4f-gdsfg" Oct 
01 10:31:35 crc kubenswrapper[4735]: I1001 10:31:35.608207 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc112b48-aa35-4a2a-a496-ac720211a123-config\") pod \"dnsmasq-dns-57d769cc4f-gdsfg\" (UID: \"dc112b48-aa35-4a2a-a496-ac720211a123\") " pod="openstack/dnsmasq-dns-57d769cc4f-gdsfg" Oct 01 10:31:35 crc kubenswrapper[4735]: I1001 10:31:35.608291 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc112b48-aa35-4a2a-a496-ac720211a123-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-gdsfg\" (UID: \"dc112b48-aa35-4a2a-a496-ac720211a123\") " pod="openstack/dnsmasq-dns-57d769cc4f-gdsfg" Oct 01 10:31:35 crc kubenswrapper[4735]: I1001 10:31:35.608355 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzj6q\" (UniqueName: \"kubernetes.io/projected/dc112b48-aa35-4a2a-a496-ac720211a123-kube-api-access-gzj6q\") pod \"dnsmasq-dns-57d769cc4f-gdsfg\" (UID: \"dc112b48-aa35-4a2a-a496-ac720211a123\") " pod="openstack/dnsmasq-dns-57d769cc4f-gdsfg" Oct 01 10:31:35 crc kubenswrapper[4735]: I1001 10:31:35.609855 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc112b48-aa35-4a2a-a496-ac720211a123-config\") pod \"dnsmasq-dns-57d769cc4f-gdsfg\" (UID: \"dc112b48-aa35-4a2a-a496-ac720211a123\") " pod="openstack/dnsmasq-dns-57d769cc4f-gdsfg" Oct 01 10:31:35 crc kubenswrapper[4735]: I1001 10:31:35.613652 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc112b48-aa35-4a2a-a496-ac720211a123-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-gdsfg\" (UID: \"dc112b48-aa35-4a2a-a496-ac720211a123\") " pod="openstack/dnsmasq-dns-57d769cc4f-gdsfg" Oct 01 10:31:35 crc kubenswrapper[4735]: I1001 10:31:35.641887 4735 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-gzj6q\" (UniqueName: \"kubernetes.io/projected/dc112b48-aa35-4a2a-a496-ac720211a123-kube-api-access-gzj6q\") pod \"dnsmasq-dns-57d769cc4f-gdsfg\" (UID: \"dc112b48-aa35-4a2a-a496-ac720211a123\") " pod="openstack/dnsmasq-dns-57d769cc4f-gdsfg" Oct 01 10:31:35 crc kubenswrapper[4735]: I1001 10:31:35.643838 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gdsfg" Oct 01 10:31:35 crc kubenswrapper[4735]: I1001 10:31:35.827940 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-ldhfc"] Oct 01 10:31:35 crc kubenswrapper[4735]: W1001 10:31:35.846082 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeae96f88_b581_42fd_b127_bd1e94d4f977.slice/crio-02aee1a70ba097c58ef328262b3276274f2919c5b28935ad41903fb1898326ef WatchSource:0}: Error finding container 02aee1a70ba097c58ef328262b3276274f2919c5b28935ad41903fb1898326ef: Status 404 returned error can't find the container with id 02aee1a70ba097c58ef328262b3276274f2919c5b28935ad41903fb1898326ef Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.028706 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-ldhfc" event={"ID":"eae96f88-b581-42fd-b127-bd1e94d4f977","Type":"ContainerStarted","Data":"02aee1a70ba097c58ef328262b3276274f2919c5b28935ad41903fb1898326ef"} Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.129255 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.130394 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.131672 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.133055 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.134109 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.134235 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.137627 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-m6njq" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.137773 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.137935 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.146844 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gdsfg"] Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.153308 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 10:31:36 crc kubenswrapper[4735]: W1001 10:31:36.168926 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc112b48_aa35_4a2a_a496_ac720211a123.slice/crio-3bc2964d8e62715123e97266b152703e0d2a7fbb19f77d3d8d6affc9c68380de WatchSource:0}: Error finding container 3bc2964d8e62715123e97266b152703e0d2a7fbb19f77d3d8d6affc9c68380de: Status 404 returned error 
can't find the container with id 3bc2964d8e62715123e97266b152703e0d2a7fbb19f77d3d8d6affc9c68380de Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.321100 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f74d6671-f7b0-46ae-91d4-ddb09a530249-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.321137 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f74d6671-f7b0-46ae-91d4-ddb09a530249-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.321171 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f74d6671-f7b0-46ae-91d4-ddb09a530249-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.321331 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f74d6671-f7b0-46ae-91d4-ddb09a530249-config-data\") pod \"rabbitmq-server-0\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.321385 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc 
kubenswrapper[4735]: I1001 10:31:36.321454 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f74d6671-f7b0-46ae-91d4-ddb09a530249-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.321478 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgs2x\" (UniqueName: \"kubernetes.io/projected/f74d6671-f7b0-46ae-91d4-ddb09a530249-kube-api-access-pgs2x\") pod \"rabbitmq-server-0\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.321525 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f74d6671-f7b0-46ae-91d4-ddb09a530249-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.321573 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f74d6671-f7b0-46ae-91d4-ddb09a530249-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.321594 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f74d6671-f7b0-46ae-91d4-ddb09a530249-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.321657 
4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f74d6671-f7b0-46ae-91d4-ddb09a530249-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.385550 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.388393 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.390701 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.391294 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-pl5ft" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.391354 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.391378 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.391384 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.391513 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.392219 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.401152 4735 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.423405 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f74d6671-f7b0-46ae-91d4-ddb09a530249-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.423445 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f74d6671-f7b0-46ae-91d4-ddb09a530249-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.423472 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f74d6671-f7b0-46ae-91d4-ddb09a530249-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.423523 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f74d6671-f7b0-46ae-91d4-ddb09a530249-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.423542 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f74d6671-f7b0-46ae-91d4-ddb09a530249-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.423578 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f74d6671-f7b0-46ae-91d4-ddb09a530249-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.423611 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f74d6671-f7b0-46ae-91d4-ddb09a530249-config-data\") pod \"rabbitmq-server-0\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.423642 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.423665 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f74d6671-f7b0-46ae-91d4-ddb09a530249-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.423687 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgs2x\" (UniqueName: \"kubernetes.io/projected/f74d6671-f7b0-46ae-91d4-ddb09a530249-kube-api-access-pgs2x\") pod \"rabbitmq-server-0\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.423709 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f74d6671-f7b0-46ae-91d4-ddb09a530249-rabbitmq-confd\") pod 
\"rabbitmq-server-0\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.423992 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.424740 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f74d6671-f7b0-46ae-91d4-ddb09a530249-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.425028 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f74d6671-f7b0-46ae-91d4-ddb09a530249-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.426242 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f74d6671-f7b0-46ae-91d4-ddb09a530249-config-data\") pod \"rabbitmq-server-0\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.428074 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f74d6671-f7b0-46ae-91d4-ddb09a530249-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.428891 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f74d6671-f7b0-46ae-91d4-ddb09a530249-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.440732 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgs2x\" (UniqueName: \"kubernetes.io/projected/f74d6671-f7b0-46ae-91d4-ddb09a530249-kube-api-access-pgs2x\") pod \"rabbitmq-server-0\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.444127 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f74d6671-f7b0-46ae-91d4-ddb09a530249-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.445196 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f74d6671-f7b0-46ae-91d4-ddb09a530249-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.449053 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f74d6671-f7b0-46ae-91d4-ddb09a530249-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.450266 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f74d6671-f7b0-46ae-91d4-ddb09a530249-rabbitmq-tls\") pod 
\"rabbitmq-server-0\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.454334 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.484724 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.524734 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlgjh\" (UniqueName: \"kubernetes.io/projected/0be1b363-c0e5-4c73-9359-00032a6c8ab9-kube-api-access-tlgjh\") pod \"rabbitmq-cell1-server-0\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.524778 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0be1b363-c0e5-4c73-9359-00032a6c8ab9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.524802 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0be1b363-c0e5-4c73-9359-00032a6c8ab9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.524822 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" 
(UniqueName: \"kubernetes.io/secret/0be1b363-c0e5-4c73-9359-00032a6c8ab9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.524840 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0be1b363-c0e5-4c73-9359-00032a6c8ab9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.524857 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0be1b363-c0e5-4c73-9359-00032a6c8ab9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.525192 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0be1b363-c0e5-4c73-9359-00032a6c8ab9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.525280 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0be1b363-c0e5-4c73-9359-00032a6c8ab9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.525576 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0be1b363-c0e5-4c73-9359-00032a6c8ab9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.525673 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.525785 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0be1b363-c0e5-4c73-9359-00032a6c8ab9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.626604 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0be1b363-c0e5-4c73-9359-00032a6c8ab9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.626656 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0be1b363-c0e5-4c73-9359-00032a6c8ab9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.626728 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/0be1b363-c0e5-4c73-9359-00032a6c8ab9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.626755 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.626783 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0be1b363-c0e5-4c73-9359-00032a6c8ab9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.626801 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlgjh\" (UniqueName: \"kubernetes.io/projected/0be1b363-c0e5-4c73-9359-00032a6c8ab9-kube-api-access-tlgjh\") pod \"rabbitmq-cell1-server-0\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.626815 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0be1b363-c0e5-4c73-9359-00032a6c8ab9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.626833 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0be1b363-c0e5-4c73-9359-00032a6c8ab9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.626849 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0be1b363-c0e5-4c73-9359-00032a6c8ab9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.626866 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0be1b363-c0e5-4c73-9359-00032a6c8ab9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.626880 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0be1b363-c0e5-4c73-9359-00032a6c8ab9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.627225 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0be1b363-c0e5-4c73-9359-00032a6c8ab9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.627303 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0be1b363-c0e5-4c73-9359-00032a6c8ab9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.628635 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0be1b363-c0e5-4c73-9359-00032a6c8ab9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.628730 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.631913 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0be1b363-c0e5-4c73-9359-00032a6c8ab9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.634392 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0be1b363-c0e5-4c73-9359-00032a6c8ab9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.635632 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0be1b363-c0e5-4c73-9359-00032a6c8ab9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.637038 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0be1b363-c0e5-4c73-9359-00032a6c8ab9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.642141 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0be1b363-c0e5-4c73-9359-00032a6c8ab9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.645987 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0be1b363-c0e5-4c73-9359-00032a6c8ab9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.647791 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlgjh\" (UniqueName: \"kubernetes.io/projected/0be1b363-c0e5-4c73-9359-00032a6c8ab9-kube-api-access-tlgjh\") pod \"rabbitmq-cell1-server-0\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.658156 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.716655 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:31:36 crc kubenswrapper[4735]: I1001 10:31:36.923477 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 10:31:37 crc kubenswrapper[4735]: I1001 10:31:37.050447 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f74d6671-f7b0-46ae-91d4-ddb09a530249","Type":"ContainerStarted","Data":"83e3bf19b9306ed77e4db32383597ea8f99d5083fbf0b2ab20cbd087be3b5194"} Oct 01 10:31:37 crc kubenswrapper[4735]: I1001 10:31:37.051924 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gdsfg" event={"ID":"dc112b48-aa35-4a2a-a496-ac720211a123","Type":"ContainerStarted","Data":"3bc2964d8e62715123e97266b152703e0d2a7fbb19f77d3d8d6affc9c68380de"} Oct 01 10:31:37 crc kubenswrapper[4735]: I1001 10:31:37.190611 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 10:31:37 crc kubenswrapper[4735]: W1001 10:31:37.198848 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0be1b363_c0e5_4c73_9359_00032a6c8ab9.slice/crio-7012fdbfac0f507d9765bd62ef994dcf74c327644ac8db74c521b0908d050531 WatchSource:0}: Error finding container 7012fdbfac0f507d9765bd62ef994dcf74c327644ac8db74c521b0908d050531: Status 404 returned error can't find the container with id 7012fdbfac0f507d9765bd62ef994dcf74c327644ac8db74c521b0908d050531 Oct 01 10:31:38 crc kubenswrapper[4735]: I1001 10:31:38.073027 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0be1b363-c0e5-4c73-9359-00032a6c8ab9","Type":"ContainerStarted","Data":"7012fdbfac0f507d9765bd62ef994dcf74c327644ac8db74c521b0908d050531"} Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.009962 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 
01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.011280 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.014880 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.017371 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.017587 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.019996 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.020057 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-74dj5" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.021584 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.024447 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.176154 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gncn5\" (UniqueName: \"kubernetes.io/projected/15f57822-7418-47e6-b679-aea87612b3ec-kube-api-access-gncn5\") pod \"openstack-galera-0\" (UID: \"15f57822-7418-47e6-b679-aea87612b3ec\") " pod="openstack/openstack-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.176564 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod 
\"openstack-galera-0\" (UID: \"15f57822-7418-47e6-b679-aea87612b3ec\") " pod="openstack/openstack-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.176588 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/15f57822-7418-47e6-b679-aea87612b3ec-secrets\") pod \"openstack-galera-0\" (UID: \"15f57822-7418-47e6-b679-aea87612b3ec\") " pod="openstack/openstack-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.176650 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/15f57822-7418-47e6-b679-aea87612b3ec-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"15f57822-7418-47e6-b679-aea87612b3ec\") " pod="openstack/openstack-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.176796 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/15f57822-7418-47e6-b679-aea87612b3ec-config-data-default\") pod \"openstack-galera-0\" (UID: \"15f57822-7418-47e6-b679-aea87612b3ec\") " pod="openstack/openstack-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.176844 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15f57822-7418-47e6-b679-aea87612b3ec-operator-scripts\") pod \"openstack-galera-0\" (UID: \"15f57822-7418-47e6-b679-aea87612b3ec\") " pod="openstack/openstack-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.177188 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/15f57822-7418-47e6-b679-aea87612b3ec-kolla-config\") pod \"openstack-galera-0\" (UID: 
\"15f57822-7418-47e6-b679-aea87612b3ec\") " pod="openstack/openstack-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.177325 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/15f57822-7418-47e6-b679-aea87612b3ec-config-data-generated\") pod \"openstack-galera-0\" (UID: \"15f57822-7418-47e6-b679-aea87612b3ec\") " pod="openstack/openstack-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.177408 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15f57822-7418-47e6-b679-aea87612b3ec-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"15f57822-7418-47e6-b679-aea87612b3ec\") " pod="openstack/openstack-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.278663 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/15f57822-7418-47e6-b679-aea87612b3ec-config-data-generated\") pod \"openstack-galera-0\" (UID: \"15f57822-7418-47e6-b679-aea87612b3ec\") " pod="openstack/openstack-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.278756 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15f57822-7418-47e6-b679-aea87612b3ec-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"15f57822-7418-47e6-b679-aea87612b3ec\") " pod="openstack/openstack-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.278794 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gncn5\" (UniqueName: \"kubernetes.io/projected/15f57822-7418-47e6-b679-aea87612b3ec-kube-api-access-gncn5\") pod \"openstack-galera-0\" (UID: \"15f57822-7418-47e6-b679-aea87612b3ec\") " 
pod="openstack/openstack-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.278838 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"15f57822-7418-47e6-b679-aea87612b3ec\") " pod="openstack/openstack-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.278856 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/15f57822-7418-47e6-b679-aea87612b3ec-secrets\") pod \"openstack-galera-0\" (UID: \"15f57822-7418-47e6-b679-aea87612b3ec\") " pod="openstack/openstack-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.278870 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/15f57822-7418-47e6-b679-aea87612b3ec-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"15f57822-7418-47e6-b679-aea87612b3ec\") " pod="openstack/openstack-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.278902 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/15f57822-7418-47e6-b679-aea87612b3ec-config-data-default\") pod \"openstack-galera-0\" (UID: \"15f57822-7418-47e6-b679-aea87612b3ec\") " pod="openstack/openstack-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.278931 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15f57822-7418-47e6-b679-aea87612b3ec-operator-scripts\") pod \"openstack-galera-0\" (UID: \"15f57822-7418-47e6-b679-aea87612b3ec\") " pod="openstack/openstack-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.278958 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/15f57822-7418-47e6-b679-aea87612b3ec-kolla-config\") pod \"openstack-galera-0\" (UID: \"15f57822-7418-47e6-b679-aea87612b3ec\") " pod="openstack/openstack-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.279476 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"15f57822-7418-47e6-b679-aea87612b3ec\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.279863 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/15f57822-7418-47e6-b679-aea87612b3ec-config-data-generated\") pod \"openstack-galera-0\" (UID: \"15f57822-7418-47e6-b679-aea87612b3ec\") " pod="openstack/openstack-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.280256 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/15f57822-7418-47e6-b679-aea87612b3ec-config-data-default\") pod \"openstack-galera-0\" (UID: \"15f57822-7418-47e6-b679-aea87612b3ec\") " pod="openstack/openstack-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.280798 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/15f57822-7418-47e6-b679-aea87612b3ec-kolla-config\") pod \"openstack-galera-0\" (UID: \"15f57822-7418-47e6-b679-aea87612b3ec\") " pod="openstack/openstack-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.281177 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15f57822-7418-47e6-b679-aea87612b3ec-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"15f57822-7418-47e6-b679-aea87612b3ec\") " pod="openstack/openstack-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.286756 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/15f57822-7418-47e6-b679-aea87612b3ec-secrets\") pod \"openstack-galera-0\" (UID: \"15f57822-7418-47e6-b679-aea87612b3ec\") " pod="openstack/openstack-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.286826 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/15f57822-7418-47e6-b679-aea87612b3ec-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"15f57822-7418-47e6-b679-aea87612b3ec\") " pod="openstack/openstack-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.290505 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15f57822-7418-47e6-b679-aea87612b3ec-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"15f57822-7418-47e6-b679-aea87612b3ec\") " pod="openstack/openstack-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.293105 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gncn5\" (UniqueName: \"kubernetes.io/projected/15f57822-7418-47e6-b679-aea87612b3ec-kube-api-access-gncn5\") pod \"openstack-galera-0\" (UID: \"15f57822-7418-47e6-b679-aea87612b3ec\") " pod="openstack/openstack-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.318701 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"15f57822-7418-47e6-b679-aea87612b3ec\") " pod="openstack/openstack-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.336736 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.424559 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.425811 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.428395 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.428647 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.428818 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-92rcw" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.430206 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.434396 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.586049 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8szk\" (UniqueName: \"kubernetes.io/projected/ec099172-9672-4553-94cd-c430818da51d-kube-api-access-z8szk\") pod \"openstack-cell1-galera-0\" (UID: \"ec099172-9672-4553-94cd-c430818da51d\") " pod="openstack/openstack-cell1-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.586150 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: 
\"ec099172-9672-4553-94cd-c430818da51d\") " pod="openstack/openstack-cell1-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.586219 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ec099172-9672-4553-94cd-c430818da51d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ec099172-9672-4553-94cd-c430818da51d\") " pod="openstack/openstack-cell1-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.586246 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec099172-9672-4553-94cd-c430818da51d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"ec099172-9672-4553-94cd-c430818da51d\") " pod="openstack/openstack-cell1-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.586271 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ec099172-9672-4553-94cd-c430818da51d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"ec099172-9672-4553-94cd-c430818da51d\") " pod="openstack/openstack-cell1-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.586297 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec099172-9672-4553-94cd-c430818da51d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ec099172-9672-4553-94cd-c430818da51d\") " pod="openstack/openstack-cell1-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.593807 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec099172-9672-4553-94cd-c430818da51d-operator-scripts\") pod 
\"openstack-cell1-galera-0\" (UID: \"ec099172-9672-4553-94cd-c430818da51d\") " pod="openstack/openstack-cell1-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.593867 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/ec099172-9672-4553-94cd-c430818da51d-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"ec099172-9672-4553-94cd-c430818da51d\") " pod="openstack/openstack-cell1-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.593902 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ec099172-9672-4553-94cd-c430818da51d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ec099172-9672-4553-94cd-c430818da51d\") " pod="openstack/openstack-cell1-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.697857 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ec099172-9672-4553-94cd-c430818da51d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ec099172-9672-4553-94cd-c430818da51d\") " pod="openstack/openstack-cell1-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.697902 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec099172-9672-4553-94cd-c430818da51d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"ec099172-9672-4553-94cd-c430818da51d\") " pod="openstack/openstack-cell1-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.697922 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ec099172-9672-4553-94cd-c430818da51d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: 
\"ec099172-9672-4553-94cd-c430818da51d\") " pod="openstack/openstack-cell1-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.697944 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec099172-9672-4553-94cd-c430818da51d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ec099172-9672-4553-94cd-c430818da51d\") " pod="openstack/openstack-cell1-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.697963 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/ec099172-9672-4553-94cd-c430818da51d-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"ec099172-9672-4553-94cd-c430818da51d\") " pod="openstack/openstack-cell1-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.697980 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec099172-9672-4553-94cd-c430818da51d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"ec099172-9672-4553-94cd-c430818da51d\") " pod="openstack/openstack-cell1-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.698003 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ec099172-9672-4553-94cd-c430818da51d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ec099172-9672-4553-94cd-c430818da51d\") " pod="openstack/openstack-cell1-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.698032 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8szk\" (UniqueName: \"kubernetes.io/projected/ec099172-9672-4553-94cd-c430818da51d-kube-api-access-z8szk\") pod \"openstack-cell1-galera-0\" (UID: \"ec099172-9672-4553-94cd-c430818da51d\") " 
pod="openstack/openstack-cell1-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.698077 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"ec099172-9672-4553-94cd-c430818da51d\") " pod="openstack/openstack-cell1-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.698745 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ec099172-9672-4553-94cd-c430818da51d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ec099172-9672-4553-94cd-c430818da51d\") " pod="openstack/openstack-cell1-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.698940 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ec099172-9672-4553-94cd-c430818da51d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"ec099172-9672-4553-94cd-c430818da51d\") " pod="openstack/openstack-cell1-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.699102 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ec099172-9672-4553-94cd-c430818da51d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ec099172-9672-4553-94cd-c430818da51d\") " pod="openstack/openstack-cell1-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.699681 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"ec099172-9672-4553-94cd-c430818da51d\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-cell1-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.701614 
4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec099172-9672-4553-94cd-c430818da51d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"ec099172-9672-4553-94cd-c430818da51d\") " pod="openstack/openstack-cell1-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.702189 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/ec099172-9672-4553-94cd-c430818da51d-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"ec099172-9672-4553-94cd-c430818da51d\") " pod="openstack/openstack-cell1-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.702799 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec099172-9672-4553-94cd-c430818da51d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"ec099172-9672-4553-94cd-c430818da51d\") " pod="openstack/openstack-cell1-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.708392 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec099172-9672-4553-94cd-c430818da51d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ec099172-9672-4553-94cd-c430818da51d\") " pod="openstack/openstack-cell1-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.719731 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8szk\" (UniqueName: \"kubernetes.io/projected/ec099172-9672-4553-94cd-c430818da51d-kube-api-access-z8szk\") pod \"openstack-cell1-galera-0\" (UID: \"ec099172-9672-4553-94cd-c430818da51d\") " pod="openstack/openstack-cell1-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.744618 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"ec099172-9672-4553-94cd-c430818da51d\") " pod="openstack/openstack-cell1-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.745004 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.746893 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.755447 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.755775 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.755960 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-nnbnp" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.756070 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.757319 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.908280 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/35686057-f8f4-4ef2-8a22-b9de9c15c9e5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"35686057-f8f4-4ef2-8a22-b9de9c15c9e5\") " pod="openstack/memcached-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.908389 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/35686057-f8f4-4ef2-8a22-b9de9c15c9e5-config-data\") pod \"memcached-0\" (UID: \"35686057-f8f4-4ef2-8a22-b9de9c15c9e5\") " pod="openstack/memcached-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.908437 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35686057-f8f4-4ef2-8a22-b9de9c15c9e5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"35686057-f8f4-4ef2-8a22-b9de9c15c9e5\") " pod="openstack/memcached-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.908523 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/35686057-f8f4-4ef2-8a22-b9de9c15c9e5-kolla-config\") pod \"memcached-0\" (UID: \"35686057-f8f4-4ef2-8a22-b9de9c15c9e5\") " pod="openstack/memcached-0" Oct 01 10:31:39 crc kubenswrapper[4735]: I1001 10:31:39.908549 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfgwt\" (UniqueName: \"kubernetes.io/projected/35686057-f8f4-4ef2-8a22-b9de9c15c9e5-kube-api-access-jfgwt\") pod \"memcached-0\" (UID: \"35686057-f8f4-4ef2-8a22-b9de9c15c9e5\") " pod="openstack/memcached-0" Oct 01 10:31:40 crc kubenswrapper[4735]: I1001 
10:31:40.010247 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/35686057-f8f4-4ef2-8a22-b9de9c15c9e5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"35686057-f8f4-4ef2-8a22-b9de9c15c9e5\") " pod="openstack/memcached-0" Oct 01 10:31:40 crc kubenswrapper[4735]: I1001 10:31:40.010319 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/35686057-f8f4-4ef2-8a22-b9de9c15c9e5-config-data\") pod \"memcached-0\" (UID: \"35686057-f8f4-4ef2-8a22-b9de9c15c9e5\") " pod="openstack/memcached-0" Oct 01 10:31:40 crc kubenswrapper[4735]: I1001 10:31:40.010357 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35686057-f8f4-4ef2-8a22-b9de9c15c9e5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"35686057-f8f4-4ef2-8a22-b9de9c15c9e5\") " pod="openstack/memcached-0" Oct 01 10:31:40 crc kubenswrapper[4735]: I1001 10:31:40.010402 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/35686057-f8f4-4ef2-8a22-b9de9c15c9e5-kolla-config\") pod \"memcached-0\" (UID: \"35686057-f8f4-4ef2-8a22-b9de9c15c9e5\") " pod="openstack/memcached-0" Oct 01 10:31:40 crc kubenswrapper[4735]: I1001 10:31:40.010419 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfgwt\" (UniqueName: \"kubernetes.io/projected/35686057-f8f4-4ef2-8a22-b9de9c15c9e5-kube-api-access-jfgwt\") pod \"memcached-0\" (UID: \"35686057-f8f4-4ef2-8a22-b9de9c15c9e5\") " pod="openstack/memcached-0" Oct 01 10:31:40 crc kubenswrapper[4735]: I1001 10:31:40.011985 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/35686057-f8f4-4ef2-8a22-b9de9c15c9e5-config-data\") pod 
\"memcached-0\" (UID: \"35686057-f8f4-4ef2-8a22-b9de9c15c9e5\") " pod="openstack/memcached-0" Oct 01 10:31:40 crc kubenswrapper[4735]: I1001 10:31:40.012778 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/35686057-f8f4-4ef2-8a22-b9de9c15c9e5-kolla-config\") pod \"memcached-0\" (UID: \"35686057-f8f4-4ef2-8a22-b9de9c15c9e5\") " pod="openstack/memcached-0" Oct 01 10:31:40 crc kubenswrapper[4735]: I1001 10:31:40.015579 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35686057-f8f4-4ef2-8a22-b9de9c15c9e5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"35686057-f8f4-4ef2-8a22-b9de9c15c9e5\") " pod="openstack/memcached-0" Oct 01 10:31:40 crc kubenswrapper[4735]: I1001 10:31:40.015899 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/35686057-f8f4-4ef2-8a22-b9de9c15c9e5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"35686057-f8f4-4ef2-8a22-b9de9c15c9e5\") " pod="openstack/memcached-0" Oct 01 10:31:40 crc kubenswrapper[4735]: I1001 10:31:40.040106 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfgwt\" (UniqueName: \"kubernetes.io/projected/35686057-f8f4-4ef2-8a22-b9de9c15c9e5-kube-api-access-jfgwt\") pod \"memcached-0\" (UID: \"35686057-f8f4-4ef2-8a22-b9de9c15c9e5\") " pod="openstack/memcached-0" Oct 01 10:31:40 crc kubenswrapper[4735]: I1001 10:31:40.113134 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 01 10:31:41 crc kubenswrapper[4735]: I1001 10:31:41.726381 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 10:31:41 crc kubenswrapper[4735]: I1001 10:31:41.727826 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 10:31:41 crc kubenswrapper[4735]: I1001 10:31:41.730543 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-hklpr" Oct 01 10:31:41 crc kubenswrapper[4735]: I1001 10:31:41.734392 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 10:31:41 crc kubenswrapper[4735]: I1001 10:31:41.744930 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wpwh\" (UniqueName: \"kubernetes.io/projected/e1850a74-906c-4ee8-aef6-1e5e32661ac6-kube-api-access-5wpwh\") pod \"kube-state-metrics-0\" (UID: \"e1850a74-906c-4ee8-aef6-1e5e32661ac6\") " pod="openstack/kube-state-metrics-0" Oct 01 10:31:41 crc kubenswrapper[4735]: I1001 10:31:41.847220 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wpwh\" (UniqueName: \"kubernetes.io/projected/e1850a74-906c-4ee8-aef6-1e5e32661ac6-kube-api-access-5wpwh\") pod \"kube-state-metrics-0\" (UID: \"e1850a74-906c-4ee8-aef6-1e5e32661ac6\") " pod="openstack/kube-state-metrics-0" Oct 01 10:31:41 crc kubenswrapper[4735]: I1001 10:31:41.865131 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wpwh\" (UniqueName: \"kubernetes.io/projected/e1850a74-906c-4ee8-aef6-1e5e32661ac6-kube-api-access-5wpwh\") pod \"kube-state-metrics-0\" (UID: \"e1850a74-906c-4ee8-aef6-1e5e32661ac6\") " pod="openstack/kube-state-metrics-0" Oct 01 10:31:42 crc kubenswrapper[4735]: I1001 10:31:42.049800 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 10:31:45 crc kubenswrapper[4735]: I1001 10:31:45.938077 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-zmn7b"] Oct 01 10:31:45 crc kubenswrapper[4735]: I1001 10:31:45.939951 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zmn7b" Oct 01 10:31:45 crc kubenswrapper[4735]: I1001 10:31:45.942070 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 01 10:31:45 crc kubenswrapper[4735]: I1001 10:31:45.944905 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-4k7wp"] Oct 01 10:31:45 crc kubenswrapper[4735]: I1001 10:31:45.945000 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 01 10:31:45 crc kubenswrapper[4735]: I1001 10:31:45.946532 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-4k7wp" Oct 01 10:31:45 crc kubenswrapper[4735]: I1001 10:31:45.953228 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-mm2mv" Oct 01 10:31:45 crc kubenswrapper[4735]: I1001 10:31:45.954507 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zmn7b"] Oct 01 10:31:45 crc kubenswrapper[4735]: I1001 10:31:45.970835 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-4k7wp"] Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.009167 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/43ce7edf-2010-4ccf-ac60-b26606130624-var-run\") pod \"ovn-controller-ovs-4k7wp\" (UID: \"43ce7edf-2010-4ccf-ac60-b26606130624\") " pod="openstack/ovn-controller-ovs-4k7wp" Oct 01 10:31:46 crc kubenswrapper[4735]: 
I1001 10:31:46.009216 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3bcc7869-f6b2-4c99-adde-40577b12c99d-var-run\") pod \"ovn-controller-zmn7b\" (UID: \"3bcc7869-f6b2-4c99-adde-40577b12c99d\") " pod="openstack/ovn-controller-zmn7b" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.009293 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bcc7869-f6b2-4c99-adde-40577b12c99d-ovn-controller-tls-certs\") pod \"ovn-controller-zmn7b\" (UID: \"3bcc7869-f6b2-4c99-adde-40577b12c99d\") " pod="openstack/ovn-controller-zmn7b" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.009341 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/43ce7edf-2010-4ccf-ac60-b26606130624-var-log\") pod \"ovn-controller-ovs-4k7wp\" (UID: \"43ce7edf-2010-4ccf-ac60-b26606130624\") " pod="openstack/ovn-controller-ovs-4k7wp" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.009370 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3bcc7869-f6b2-4c99-adde-40577b12c99d-var-run-ovn\") pod \"ovn-controller-zmn7b\" (UID: \"3bcc7869-f6b2-4c99-adde-40577b12c99d\") " pod="openstack/ovn-controller-zmn7b" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.009416 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43ce7edf-2010-4ccf-ac60-b26606130624-scripts\") pod \"ovn-controller-ovs-4k7wp\" (UID: \"43ce7edf-2010-4ccf-ac60-b26606130624\") " pod="openstack/ovn-controller-ovs-4k7wp" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.009447 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/43ce7edf-2010-4ccf-ac60-b26606130624-var-lib\") pod \"ovn-controller-ovs-4k7wp\" (UID: \"43ce7edf-2010-4ccf-ac60-b26606130624\") " pod="openstack/ovn-controller-ovs-4k7wp" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.009510 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3bcc7869-f6b2-4c99-adde-40577b12c99d-var-log-ovn\") pod \"ovn-controller-zmn7b\" (UID: \"3bcc7869-f6b2-4c99-adde-40577b12c99d\") " pod="openstack/ovn-controller-zmn7b" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.009563 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/43ce7edf-2010-4ccf-ac60-b26606130624-etc-ovs\") pod \"ovn-controller-ovs-4k7wp\" (UID: \"43ce7edf-2010-4ccf-ac60-b26606130624\") " pod="openstack/ovn-controller-ovs-4k7wp" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.009606 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bcc7869-f6b2-4c99-adde-40577b12c99d-combined-ca-bundle\") pod \"ovn-controller-zmn7b\" (UID: \"3bcc7869-f6b2-4c99-adde-40577b12c99d\") " pod="openstack/ovn-controller-zmn7b" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.009630 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3bcc7869-f6b2-4c99-adde-40577b12c99d-scripts\") pod \"ovn-controller-zmn7b\" (UID: \"3bcc7869-f6b2-4c99-adde-40577b12c99d\") " pod="openstack/ovn-controller-zmn7b" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.009655 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-2tf6c\" (UniqueName: \"kubernetes.io/projected/3bcc7869-f6b2-4c99-adde-40577b12c99d-kube-api-access-2tf6c\") pod \"ovn-controller-zmn7b\" (UID: \"3bcc7869-f6b2-4c99-adde-40577b12c99d\") " pod="openstack/ovn-controller-zmn7b" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.009674 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbq7d\" (UniqueName: \"kubernetes.io/projected/43ce7edf-2010-4ccf-ac60-b26606130624-kube-api-access-pbq7d\") pod \"ovn-controller-ovs-4k7wp\" (UID: \"43ce7edf-2010-4ccf-ac60-b26606130624\") " pod="openstack/ovn-controller-ovs-4k7wp" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.111168 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/43ce7edf-2010-4ccf-ac60-b26606130624-var-lib\") pod \"ovn-controller-ovs-4k7wp\" (UID: \"43ce7edf-2010-4ccf-ac60-b26606130624\") " pod="openstack/ovn-controller-ovs-4k7wp" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.111224 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3bcc7869-f6b2-4c99-adde-40577b12c99d-var-log-ovn\") pod \"ovn-controller-zmn7b\" (UID: \"3bcc7869-f6b2-4c99-adde-40577b12c99d\") " pod="openstack/ovn-controller-zmn7b" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.111267 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/43ce7edf-2010-4ccf-ac60-b26606130624-etc-ovs\") pod \"ovn-controller-ovs-4k7wp\" (UID: \"43ce7edf-2010-4ccf-ac60-b26606130624\") " pod="openstack/ovn-controller-ovs-4k7wp" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.111298 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3bcc7869-f6b2-4c99-adde-40577b12c99d-combined-ca-bundle\") pod \"ovn-controller-zmn7b\" (UID: \"3bcc7869-f6b2-4c99-adde-40577b12c99d\") " pod="openstack/ovn-controller-zmn7b" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.111323 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3bcc7869-f6b2-4c99-adde-40577b12c99d-scripts\") pod \"ovn-controller-zmn7b\" (UID: \"3bcc7869-f6b2-4c99-adde-40577b12c99d\") " pod="openstack/ovn-controller-zmn7b" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.111353 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tf6c\" (UniqueName: \"kubernetes.io/projected/3bcc7869-f6b2-4c99-adde-40577b12c99d-kube-api-access-2tf6c\") pod \"ovn-controller-zmn7b\" (UID: \"3bcc7869-f6b2-4c99-adde-40577b12c99d\") " pod="openstack/ovn-controller-zmn7b" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.111386 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbq7d\" (UniqueName: \"kubernetes.io/projected/43ce7edf-2010-4ccf-ac60-b26606130624-kube-api-access-pbq7d\") pod \"ovn-controller-ovs-4k7wp\" (UID: \"43ce7edf-2010-4ccf-ac60-b26606130624\") " pod="openstack/ovn-controller-ovs-4k7wp" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.111472 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/43ce7edf-2010-4ccf-ac60-b26606130624-var-run\") pod \"ovn-controller-ovs-4k7wp\" (UID: \"43ce7edf-2010-4ccf-ac60-b26606130624\") " pod="openstack/ovn-controller-ovs-4k7wp" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.111696 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/43ce7edf-2010-4ccf-ac60-b26606130624-var-lib\") pod \"ovn-controller-ovs-4k7wp\" (UID: 
\"43ce7edf-2010-4ccf-ac60-b26606130624\") " pod="openstack/ovn-controller-ovs-4k7wp" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.111751 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3bcc7869-f6b2-4c99-adde-40577b12c99d-var-run\") pod \"ovn-controller-zmn7b\" (UID: \"3bcc7869-f6b2-4c99-adde-40577b12c99d\") " pod="openstack/ovn-controller-zmn7b" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.111788 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bcc7869-f6b2-4c99-adde-40577b12c99d-ovn-controller-tls-certs\") pod \"ovn-controller-zmn7b\" (UID: \"3bcc7869-f6b2-4c99-adde-40577b12c99d\") " pod="openstack/ovn-controller-zmn7b" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.111753 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/43ce7edf-2010-4ccf-ac60-b26606130624-etc-ovs\") pod \"ovn-controller-ovs-4k7wp\" (UID: \"43ce7edf-2010-4ccf-ac60-b26606130624\") " pod="openstack/ovn-controller-ovs-4k7wp" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.111812 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/43ce7edf-2010-4ccf-ac60-b26606130624-var-log\") pod \"ovn-controller-ovs-4k7wp\" (UID: \"43ce7edf-2010-4ccf-ac60-b26606130624\") " pod="openstack/ovn-controller-ovs-4k7wp" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.111842 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3bcc7869-f6b2-4c99-adde-40577b12c99d-var-run-ovn\") pod \"ovn-controller-zmn7b\" (UID: \"3bcc7869-f6b2-4c99-adde-40577b12c99d\") " pod="openstack/ovn-controller-zmn7b" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.111867 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43ce7edf-2010-4ccf-ac60-b26606130624-scripts\") pod \"ovn-controller-ovs-4k7wp\" (UID: \"43ce7edf-2010-4ccf-ac60-b26606130624\") " pod="openstack/ovn-controller-ovs-4k7wp" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.111876 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3bcc7869-f6b2-4c99-adde-40577b12c99d-var-run\") pod \"ovn-controller-zmn7b\" (UID: \"3bcc7869-f6b2-4c99-adde-40577b12c99d\") " pod="openstack/ovn-controller-zmn7b" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.111892 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/43ce7edf-2010-4ccf-ac60-b26606130624-var-run\") pod \"ovn-controller-ovs-4k7wp\" (UID: \"43ce7edf-2010-4ccf-ac60-b26606130624\") " pod="openstack/ovn-controller-ovs-4k7wp" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.112116 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3bcc7869-f6b2-4c99-adde-40577b12c99d-var-run-ovn\") pod \"ovn-controller-zmn7b\" (UID: \"3bcc7869-f6b2-4c99-adde-40577b12c99d\") " pod="openstack/ovn-controller-zmn7b" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.112192 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/43ce7edf-2010-4ccf-ac60-b26606130624-var-log\") pod \"ovn-controller-ovs-4k7wp\" (UID: \"43ce7edf-2010-4ccf-ac60-b26606130624\") " pod="openstack/ovn-controller-ovs-4k7wp" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.112678 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3bcc7869-f6b2-4c99-adde-40577b12c99d-var-log-ovn\") pod \"ovn-controller-zmn7b\" 
(UID: \"3bcc7869-f6b2-4c99-adde-40577b12c99d\") " pod="openstack/ovn-controller-zmn7b" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.113888 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43ce7edf-2010-4ccf-ac60-b26606130624-scripts\") pod \"ovn-controller-ovs-4k7wp\" (UID: \"43ce7edf-2010-4ccf-ac60-b26606130624\") " pod="openstack/ovn-controller-ovs-4k7wp" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.115843 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3bcc7869-f6b2-4c99-adde-40577b12c99d-scripts\") pod \"ovn-controller-zmn7b\" (UID: \"3bcc7869-f6b2-4c99-adde-40577b12c99d\") " pod="openstack/ovn-controller-zmn7b" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.116971 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bcc7869-f6b2-4c99-adde-40577b12c99d-combined-ca-bundle\") pod \"ovn-controller-zmn7b\" (UID: \"3bcc7869-f6b2-4c99-adde-40577b12c99d\") " pod="openstack/ovn-controller-zmn7b" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.117365 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bcc7869-f6b2-4c99-adde-40577b12c99d-ovn-controller-tls-certs\") pod \"ovn-controller-zmn7b\" (UID: \"3bcc7869-f6b2-4c99-adde-40577b12c99d\") " pod="openstack/ovn-controller-zmn7b" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.126278 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tf6c\" (UniqueName: \"kubernetes.io/projected/3bcc7869-f6b2-4c99-adde-40577b12c99d-kube-api-access-2tf6c\") pod \"ovn-controller-zmn7b\" (UID: \"3bcc7869-f6b2-4c99-adde-40577b12c99d\") " pod="openstack/ovn-controller-zmn7b" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.129655 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbq7d\" (UniqueName: \"kubernetes.io/projected/43ce7edf-2010-4ccf-ac60-b26606130624-kube-api-access-pbq7d\") pod \"ovn-controller-ovs-4k7wp\" (UID: \"43ce7edf-2010-4ccf-ac60-b26606130624\") " pod="openstack/ovn-controller-ovs-4k7wp" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.262759 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zmn7b" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.270947 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-4k7wp" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.554887 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.556107 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.558601 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.558957 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.559208 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-mkvqx" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.559252 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.566028 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.574611 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovsdbserver-nb-0"] Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.721472 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33a18104-7c91-4314-99e8-37396ef7c259-config\") pod \"ovsdbserver-nb-0\" (UID: \"33a18104-7c91-4314-99e8-37396ef7c259\") " pod="openstack/ovsdbserver-nb-0" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.721543 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/33a18104-7c91-4314-99e8-37396ef7c259-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"33a18104-7c91-4314-99e8-37396ef7c259\") " pod="openstack/ovsdbserver-nb-0" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.721573 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/33a18104-7c91-4314-99e8-37396ef7c259-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"33a18104-7c91-4314-99e8-37396ef7c259\") " pod="openstack/ovsdbserver-nb-0" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.721626 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"33a18104-7c91-4314-99e8-37396ef7c259\") " pod="openstack/ovsdbserver-nb-0" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.721651 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/33a18104-7c91-4314-99e8-37396ef7c259-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"33a18104-7c91-4314-99e8-37396ef7c259\") " pod="openstack/ovsdbserver-nb-0" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 
10:31:46.721684 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33a18104-7c91-4314-99e8-37396ef7c259-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"33a18104-7c91-4314-99e8-37396ef7c259\") " pod="openstack/ovsdbserver-nb-0" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.721711 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33a18104-7c91-4314-99e8-37396ef7c259-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"33a18104-7c91-4314-99e8-37396ef7c259\") " pod="openstack/ovsdbserver-nb-0" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.721738 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svqwc\" (UniqueName: \"kubernetes.io/projected/33a18104-7c91-4314-99e8-37396ef7c259-kube-api-access-svqwc\") pod \"ovsdbserver-nb-0\" (UID: \"33a18104-7c91-4314-99e8-37396ef7c259\") " pod="openstack/ovsdbserver-nb-0" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.822985 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"33a18104-7c91-4314-99e8-37396ef7c259\") " pod="openstack/ovsdbserver-nb-0" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.823026 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/33a18104-7c91-4314-99e8-37396ef7c259-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"33a18104-7c91-4314-99e8-37396ef7c259\") " pod="openstack/ovsdbserver-nb-0" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.823057 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/33a18104-7c91-4314-99e8-37396ef7c259-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"33a18104-7c91-4314-99e8-37396ef7c259\") " pod="openstack/ovsdbserver-nb-0" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.823078 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33a18104-7c91-4314-99e8-37396ef7c259-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"33a18104-7c91-4314-99e8-37396ef7c259\") " pod="openstack/ovsdbserver-nb-0" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.823098 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svqwc\" (UniqueName: \"kubernetes.io/projected/33a18104-7c91-4314-99e8-37396ef7c259-kube-api-access-svqwc\") pod \"ovsdbserver-nb-0\" (UID: \"33a18104-7c91-4314-99e8-37396ef7c259\") " pod="openstack/ovsdbserver-nb-0" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.823165 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33a18104-7c91-4314-99e8-37396ef7c259-config\") pod \"ovsdbserver-nb-0\" (UID: \"33a18104-7c91-4314-99e8-37396ef7c259\") " pod="openstack/ovsdbserver-nb-0" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.823189 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/33a18104-7c91-4314-99e8-37396ef7c259-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"33a18104-7c91-4314-99e8-37396ef7c259\") " pod="openstack/ovsdbserver-nb-0" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.823221 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/33a18104-7c91-4314-99e8-37396ef7c259-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"33a18104-7c91-4314-99e8-37396ef7c259\") " pod="openstack/ovsdbserver-nb-0" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.824008 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"33a18104-7c91-4314-99e8-37396ef7c259\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-nb-0" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.824809 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33a18104-7c91-4314-99e8-37396ef7c259-config\") pod \"ovsdbserver-nb-0\" (UID: \"33a18104-7c91-4314-99e8-37396ef7c259\") " pod="openstack/ovsdbserver-nb-0" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.824879 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/33a18104-7c91-4314-99e8-37396ef7c259-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"33a18104-7c91-4314-99e8-37396ef7c259\") " pod="openstack/ovsdbserver-nb-0" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.825203 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33a18104-7c91-4314-99e8-37396ef7c259-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"33a18104-7c91-4314-99e8-37396ef7c259\") " pod="openstack/ovsdbserver-nb-0" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.827438 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/33a18104-7c91-4314-99e8-37396ef7c259-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"33a18104-7c91-4314-99e8-37396ef7c259\") " pod="openstack/ovsdbserver-nb-0" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.830271 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33a18104-7c91-4314-99e8-37396ef7c259-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"33a18104-7c91-4314-99e8-37396ef7c259\") " pod="openstack/ovsdbserver-nb-0" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.830439 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/33a18104-7c91-4314-99e8-37396ef7c259-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"33a18104-7c91-4314-99e8-37396ef7c259\") " pod="openstack/ovsdbserver-nb-0" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.842469 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svqwc\" (UniqueName: \"kubernetes.io/projected/33a18104-7c91-4314-99e8-37396ef7c259-kube-api-access-svqwc\") pod \"ovsdbserver-nb-0\" (UID: \"33a18104-7c91-4314-99e8-37396ef7c259\") " pod="openstack/ovsdbserver-nb-0" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.844574 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"33a18104-7c91-4314-99e8-37396ef7c259\") " pod="openstack/ovsdbserver-nb-0" Oct 01 10:31:46 crc kubenswrapper[4735]: I1001 10:31:46.875174 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 01 10:31:48 crc kubenswrapper[4735]: I1001 10:31:48.571200 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 01 10:31:48 crc kubenswrapper[4735]: I1001 10:31:48.572578 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 01 10:31:48 crc kubenswrapper[4735]: I1001 10:31:48.574122 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-fzcq4" Oct 01 10:31:48 crc kubenswrapper[4735]: I1001 10:31:48.577386 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 01 10:31:48 crc kubenswrapper[4735]: I1001 10:31:48.577442 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 01 10:31:48 crc kubenswrapper[4735]: I1001 10:31:48.577796 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 01 10:31:48 crc kubenswrapper[4735]: I1001 10:31:48.584088 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 01 10:31:48 crc kubenswrapper[4735]: I1001 10:31:48.757481 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa211c9f-0ea4-4b46-9f37-c5917dd0d833-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"aa211c9f-0ea4-4b46-9f37-c5917dd0d833\") " pod="openstack/ovsdbserver-sb-0" Oct 01 10:31:48 crc kubenswrapper[4735]: I1001 10:31:48.757556 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xmxb\" (UniqueName: \"kubernetes.io/projected/aa211c9f-0ea4-4b46-9f37-c5917dd0d833-kube-api-access-9xmxb\") pod \"ovsdbserver-sb-0\" (UID: \"aa211c9f-0ea4-4b46-9f37-c5917dd0d833\") " pod="openstack/ovsdbserver-sb-0" Oct 01 10:31:48 crc kubenswrapper[4735]: I1001 10:31:48.757607 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/aa211c9f-0ea4-4b46-9f37-c5917dd0d833-ovsdb-rundir\") pod 
\"ovsdbserver-sb-0\" (UID: \"aa211c9f-0ea4-4b46-9f37-c5917dd0d833\") " pod="openstack/ovsdbserver-sb-0" Oct 01 10:31:48 crc kubenswrapper[4735]: I1001 10:31:48.757672 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa211c9f-0ea4-4b46-9f37-c5917dd0d833-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"aa211c9f-0ea4-4b46-9f37-c5917dd0d833\") " pod="openstack/ovsdbserver-sb-0" Oct 01 10:31:48 crc kubenswrapper[4735]: I1001 10:31:48.757703 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa211c9f-0ea4-4b46-9f37-c5917dd0d833-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"aa211c9f-0ea4-4b46-9f37-c5917dd0d833\") " pod="openstack/ovsdbserver-sb-0" Oct 01 10:31:48 crc kubenswrapper[4735]: I1001 10:31:48.757739 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa211c9f-0ea4-4b46-9f37-c5917dd0d833-config\") pod \"ovsdbserver-sb-0\" (UID: \"aa211c9f-0ea4-4b46-9f37-c5917dd0d833\") " pod="openstack/ovsdbserver-sb-0" Oct 01 10:31:48 crc kubenswrapper[4735]: I1001 10:31:48.757778 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"aa211c9f-0ea4-4b46-9f37-c5917dd0d833\") " pod="openstack/ovsdbserver-sb-0" Oct 01 10:31:48 crc kubenswrapper[4735]: I1001 10:31:48.757798 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa211c9f-0ea4-4b46-9f37-c5917dd0d833-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"aa211c9f-0ea4-4b46-9f37-c5917dd0d833\") " pod="openstack/ovsdbserver-sb-0" Oct 01 
10:31:48 crc kubenswrapper[4735]: I1001 10:31:48.858761 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa211c9f-0ea4-4b46-9f37-c5917dd0d833-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"aa211c9f-0ea4-4b46-9f37-c5917dd0d833\") " pod="openstack/ovsdbserver-sb-0" Oct 01 10:31:48 crc kubenswrapper[4735]: I1001 10:31:48.858820 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xmxb\" (UniqueName: \"kubernetes.io/projected/aa211c9f-0ea4-4b46-9f37-c5917dd0d833-kube-api-access-9xmxb\") pod \"ovsdbserver-sb-0\" (UID: \"aa211c9f-0ea4-4b46-9f37-c5917dd0d833\") " pod="openstack/ovsdbserver-sb-0" Oct 01 10:31:48 crc kubenswrapper[4735]: I1001 10:31:48.858866 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/aa211c9f-0ea4-4b46-9f37-c5917dd0d833-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"aa211c9f-0ea4-4b46-9f37-c5917dd0d833\") " pod="openstack/ovsdbserver-sb-0" Oct 01 10:31:48 crc kubenswrapper[4735]: I1001 10:31:48.858929 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa211c9f-0ea4-4b46-9f37-c5917dd0d833-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"aa211c9f-0ea4-4b46-9f37-c5917dd0d833\") " pod="openstack/ovsdbserver-sb-0" Oct 01 10:31:48 crc kubenswrapper[4735]: I1001 10:31:48.858958 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa211c9f-0ea4-4b46-9f37-c5917dd0d833-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"aa211c9f-0ea4-4b46-9f37-c5917dd0d833\") " pod="openstack/ovsdbserver-sb-0" Oct 01 10:31:48 crc kubenswrapper[4735]: I1001 10:31:48.858994 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/aa211c9f-0ea4-4b46-9f37-c5917dd0d833-config\") pod \"ovsdbserver-sb-0\" (UID: \"aa211c9f-0ea4-4b46-9f37-c5917dd0d833\") " pod="openstack/ovsdbserver-sb-0" Oct 01 10:31:48 crc kubenswrapper[4735]: I1001 10:31:48.859031 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"aa211c9f-0ea4-4b46-9f37-c5917dd0d833\") " pod="openstack/ovsdbserver-sb-0" Oct 01 10:31:48 crc kubenswrapper[4735]: I1001 10:31:48.859057 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa211c9f-0ea4-4b46-9f37-c5917dd0d833-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"aa211c9f-0ea4-4b46-9f37-c5917dd0d833\") " pod="openstack/ovsdbserver-sb-0" Oct 01 10:31:48 crc kubenswrapper[4735]: I1001 10:31:48.859239 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"aa211c9f-0ea4-4b46-9f37-c5917dd0d833\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/ovsdbserver-sb-0" Oct 01 10:31:48 crc kubenswrapper[4735]: I1001 10:31:48.859541 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/aa211c9f-0ea4-4b46-9f37-c5917dd0d833-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"aa211c9f-0ea4-4b46-9f37-c5917dd0d833\") " pod="openstack/ovsdbserver-sb-0" Oct 01 10:31:48 crc kubenswrapper[4735]: I1001 10:31:48.859874 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa211c9f-0ea4-4b46-9f37-c5917dd0d833-config\") pod \"ovsdbserver-sb-0\" (UID: \"aa211c9f-0ea4-4b46-9f37-c5917dd0d833\") " pod="openstack/ovsdbserver-sb-0" Oct 01 
10:31:48 crc kubenswrapper[4735]: I1001 10:31:48.860325 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa211c9f-0ea4-4b46-9f37-c5917dd0d833-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"aa211c9f-0ea4-4b46-9f37-c5917dd0d833\") " pod="openstack/ovsdbserver-sb-0" Oct 01 10:31:48 crc kubenswrapper[4735]: I1001 10:31:48.865400 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa211c9f-0ea4-4b46-9f37-c5917dd0d833-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"aa211c9f-0ea4-4b46-9f37-c5917dd0d833\") " pod="openstack/ovsdbserver-sb-0" Oct 01 10:31:48 crc kubenswrapper[4735]: I1001 10:31:48.865432 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa211c9f-0ea4-4b46-9f37-c5917dd0d833-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"aa211c9f-0ea4-4b46-9f37-c5917dd0d833\") " pod="openstack/ovsdbserver-sb-0" Oct 01 10:31:48 crc kubenswrapper[4735]: I1001 10:31:48.868100 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa211c9f-0ea4-4b46-9f37-c5917dd0d833-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"aa211c9f-0ea4-4b46-9f37-c5917dd0d833\") " pod="openstack/ovsdbserver-sb-0" Oct 01 10:31:48 crc kubenswrapper[4735]: I1001 10:31:48.875935 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xmxb\" (UniqueName: \"kubernetes.io/projected/aa211c9f-0ea4-4b46-9f37-c5917dd0d833-kube-api-access-9xmxb\") pod \"ovsdbserver-sb-0\" (UID: \"aa211c9f-0ea4-4b46-9f37-c5917dd0d833\") " pod="openstack/ovsdbserver-sb-0" Oct 01 10:31:48 crc kubenswrapper[4735]: I1001 10:31:48.882223 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"aa211c9f-0ea4-4b46-9f37-c5917dd0d833\") " pod="openstack/ovsdbserver-sb-0" Oct 01 10:31:48 crc kubenswrapper[4735]: I1001 10:31:48.897390 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 01 10:31:56 crc kubenswrapper[4735]: E1001 10:31:56.289351 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 01 10:31:56 crc kubenswrapper[4735]: E1001 10:31:56.290078 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lb8xl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-z6pb7_openstack(c545ee1a-4d5c-4f0d-a44c-723493610cc7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 10:31:56 crc kubenswrapper[4735]: E1001 10:31:56.291288 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-z6pb7" podUID="c545ee1a-4d5c-4f0d-a44c-723493610cc7" Oct 01 10:31:56 crc kubenswrapper[4735]: E1001 10:31:56.342868 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 01 10:31:56 crc kubenswrapper[4735]: E1001 10:31:56.343328 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xzq46,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullP
olicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-ldhfc_openstack(eae96f88-b581-42fd-b127-bd1e94d4f977): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 10:31:56 crc kubenswrapper[4735]: E1001 10:31:56.344568 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-ldhfc" podUID="eae96f88-b581-42fd-b127-bd1e94d4f977" Oct 01 10:31:56 crc kubenswrapper[4735]: E1001 10:31:56.364532 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 01 10:31:56 crc kubenswrapper[4735]: E1001 10:31:56.364707 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gzj6q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-gdsfg_openstack(dc112b48-aa35-4a2a-a496-ac720211a123): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 10:31:56 crc kubenswrapper[4735]: E1001 10:31:56.365832 4735 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-gdsfg" podUID="dc112b48-aa35-4a2a-a496-ac720211a123" Oct 01 10:31:56 crc kubenswrapper[4735]: E1001 10:31:56.370190 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 01 10:31:56 crc kubenswrapper[4735]: E1001 10:31:56.370429 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sm8pc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-xlnkq_openstack(e382d805-f31c-4226-a51d-1e464b9d613c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 10:31:56 crc kubenswrapper[4735]: E1001 10:31:56.371770 4735 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-xlnkq" podUID="e382d805-f31c-4226-a51d-1e464b9d613c" Oct 01 10:31:56 crc kubenswrapper[4735]: I1001 10:31:56.810786 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 01 10:31:56 crc kubenswrapper[4735]: I1001 10:31:56.819167 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 10:31:56 crc kubenswrapper[4735]: I1001 10:31:56.825431 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 01 10:31:56 crc kubenswrapper[4735]: W1001 10:31:56.827270 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15f57822_7418_47e6_b679_aea87612b3ec.slice/crio-0d80a10e55aee707a33c3fe11fc6f10596d11fd407e944930277dbd2e84e519f WatchSource:0}: Error finding container 0d80a10e55aee707a33c3fe11fc6f10596d11fd407e944930277dbd2e84e519f: Status 404 returned error can't find the container with id 0d80a10e55aee707a33c3fe11fc6f10596d11fd407e944930277dbd2e84e519f Oct 01 10:31:56 crc kubenswrapper[4735]: I1001 10:31:56.985091 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zmn7b"] Oct 01 10:31:56 crc kubenswrapper[4735]: I1001 10:31:56.992780 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 01 10:31:57 crc kubenswrapper[4735]: I1001 10:31:57.020211 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 01 10:31:57 crc kubenswrapper[4735]: W1001 10:31:57.073131 4735 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bcc7869_f6b2_4c99_adde_40577b12c99d.slice/crio-b23d413d89ad2a536cba956dce50043517bac77f2e28c696c8828c2860c47038 WatchSource:0}: Error finding container b23d413d89ad2a536cba956dce50043517bac77f2e28c696c8828c2860c47038: Status 404 returned error can't find the container with id b23d413d89ad2a536cba956dce50043517bac77f2e28c696c8828c2860c47038 Oct 01 10:31:57 crc kubenswrapper[4735]: W1001 10:31:57.074727 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec099172_9672_4553_94cd_c430818da51d.slice/crio-907640d162c71499adbf7c3e3c9d226a01ab71890f3953b4c172e819d4342060 WatchSource:0}: Error finding container 907640d162c71499adbf7c3e3c9d226a01ab71890f3953b4c172e819d4342060: Status 404 returned error can't find the container with id 907640d162c71499adbf7c3e3c9d226a01ab71890f3953b4c172e819d4342060 Oct 01 10:31:57 crc kubenswrapper[4735]: I1001 10:31:57.114000 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 01 10:31:57 crc kubenswrapper[4735]: W1001 10:31:57.117787 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa211c9f_0ea4_4b46_9f37_c5917dd0d833.slice/crio-411b3cb0481b635a1a5a72738c40a8949ba6092251aa35b7ebfddff84f2126d2 WatchSource:0}: Error finding container 411b3cb0481b635a1a5a72738c40a8949ba6092251aa35b7ebfddff84f2126d2: Status 404 returned error can't find the container with id 411b3cb0481b635a1a5a72738c40a8949ba6092251aa35b7ebfddff84f2126d2 Oct 01 10:31:57 crc kubenswrapper[4735]: I1001 10:31:57.216594 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"aa211c9f-0ea4-4b46-9f37-c5917dd0d833","Type":"ContainerStarted","Data":"411b3cb0481b635a1a5a72738c40a8949ba6092251aa35b7ebfddff84f2126d2"} Oct 01 10:31:57 crc kubenswrapper[4735]: I1001 
10:31:57.218050 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ec099172-9672-4553-94cd-c430818da51d","Type":"ContainerStarted","Data":"907640d162c71499adbf7c3e3c9d226a01ab71890f3953b4c172e819d4342060"} Oct 01 10:31:57 crc kubenswrapper[4735]: I1001 10:31:57.219252 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"33a18104-7c91-4314-99e8-37396ef7c259","Type":"ContainerStarted","Data":"6678e21c6d630f2d4d199a3c4defc73cbf6272bb71d271b118459c7704873594"} Oct 01 10:31:57 crc kubenswrapper[4735]: I1001 10:31:57.220150 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e1850a74-906c-4ee8-aef6-1e5e32661ac6","Type":"ContainerStarted","Data":"ed31792874cf0ee05ef0b2b0d0fbc07a78c965fc43eb52d8c717c6899e4f23d6"} Oct 01 10:31:57 crc kubenswrapper[4735]: I1001 10:31:57.221023 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"35686057-f8f4-4ef2-8a22-b9de9c15c9e5","Type":"ContainerStarted","Data":"33acf0e2aca86d0a11e2307b5621565c7100a32c38d8bea55418a90d37c95ec5"} Oct 01 10:31:57 crc kubenswrapper[4735]: I1001 10:31:57.222003 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"15f57822-7418-47e6-b679-aea87612b3ec","Type":"ContainerStarted","Data":"0d80a10e55aee707a33c3fe11fc6f10596d11fd407e944930277dbd2e84e519f"} Oct 01 10:31:57 crc kubenswrapper[4735]: I1001 10:31:57.223713 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zmn7b" event={"ID":"3bcc7869-f6b2-4c99-adde-40577b12c99d","Type":"ContainerStarted","Data":"b23d413d89ad2a536cba956dce50043517bac77f2e28c696c8828c2860c47038"} Oct 01 10:31:57 crc kubenswrapper[4735]: E1001 10:31:57.225310 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-gdsfg" podUID="dc112b48-aa35-4a2a-a496-ac720211a123" Oct 01 10:31:57 crc kubenswrapper[4735]: E1001 10:31:57.226057 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-ldhfc" podUID="eae96f88-b581-42fd-b127-bd1e94d4f977" Oct 01 10:31:57 crc kubenswrapper[4735]: I1001 10:31:57.725466 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-xlnkq" Oct 01 10:31:57 crc kubenswrapper[4735]: I1001 10:31:57.730237 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-z6pb7" Oct 01 10:31:57 crc kubenswrapper[4735]: I1001 10:31:57.804980 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e382d805-f31c-4226-a51d-1e464b9d613c-config\") pod \"e382d805-f31c-4226-a51d-1e464b9d613c\" (UID: \"e382d805-f31c-4226-a51d-1e464b9d613c\") " Oct 01 10:31:57 crc kubenswrapper[4735]: I1001 10:31:57.805021 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c545ee1a-4d5c-4f0d-a44c-723493610cc7-config\") pod \"c545ee1a-4d5c-4f0d-a44c-723493610cc7\" (UID: \"c545ee1a-4d5c-4f0d-a44c-723493610cc7\") " Oct 01 10:31:57 crc kubenswrapper[4735]: I1001 10:31:57.805039 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e382d805-f31c-4226-a51d-1e464b9d613c-dns-svc\") pod \"e382d805-f31c-4226-a51d-1e464b9d613c\" (UID: \"e382d805-f31c-4226-a51d-1e464b9d613c\") " Oct 01 10:31:57 crc 
kubenswrapper[4735]: I1001 10:31:57.805074 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sm8pc\" (UniqueName: \"kubernetes.io/projected/e382d805-f31c-4226-a51d-1e464b9d613c-kube-api-access-sm8pc\") pod \"e382d805-f31c-4226-a51d-1e464b9d613c\" (UID: \"e382d805-f31c-4226-a51d-1e464b9d613c\") " Oct 01 10:31:57 crc kubenswrapper[4735]: I1001 10:31:57.805118 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lb8xl\" (UniqueName: \"kubernetes.io/projected/c545ee1a-4d5c-4f0d-a44c-723493610cc7-kube-api-access-lb8xl\") pod \"c545ee1a-4d5c-4f0d-a44c-723493610cc7\" (UID: \"c545ee1a-4d5c-4f0d-a44c-723493610cc7\") " Oct 01 10:31:57 crc kubenswrapper[4735]: I1001 10:31:57.805633 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e382d805-f31c-4226-a51d-1e464b9d613c-config" (OuterVolumeSpecName: "config") pod "e382d805-f31c-4226-a51d-1e464b9d613c" (UID: "e382d805-f31c-4226-a51d-1e464b9d613c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:31:57 crc kubenswrapper[4735]: I1001 10:31:57.805774 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e382d805-f31c-4226-a51d-1e464b9d613c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e382d805-f31c-4226-a51d-1e464b9d613c" (UID: "e382d805-f31c-4226-a51d-1e464b9d613c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:31:57 crc kubenswrapper[4735]: I1001 10:31:57.805816 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c545ee1a-4d5c-4f0d-a44c-723493610cc7-config" (OuterVolumeSpecName: "config") pod "c545ee1a-4d5c-4f0d-a44c-723493610cc7" (UID: "c545ee1a-4d5c-4f0d-a44c-723493610cc7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:31:57 crc kubenswrapper[4735]: I1001 10:31:57.810001 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c545ee1a-4d5c-4f0d-a44c-723493610cc7-kube-api-access-lb8xl" (OuterVolumeSpecName: "kube-api-access-lb8xl") pod "c545ee1a-4d5c-4f0d-a44c-723493610cc7" (UID: "c545ee1a-4d5c-4f0d-a44c-723493610cc7"). InnerVolumeSpecName "kube-api-access-lb8xl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:31:57 crc kubenswrapper[4735]: I1001 10:31:57.810160 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e382d805-f31c-4226-a51d-1e464b9d613c-kube-api-access-sm8pc" (OuterVolumeSpecName: "kube-api-access-sm8pc") pod "e382d805-f31c-4226-a51d-1e464b9d613c" (UID: "e382d805-f31c-4226-a51d-1e464b9d613c"). InnerVolumeSpecName "kube-api-access-sm8pc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:31:57 crc kubenswrapper[4735]: I1001 10:31:57.906347 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e382d805-f31c-4226-a51d-1e464b9d613c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 10:31:57 crc kubenswrapper[4735]: I1001 10:31:57.907013 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sm8pc\" (UniqueName: \"kubernetes.io/projected/e382d805-f31c-4226-a51d-1e464b9d613c-kube-api-access-sm8pc\") on node \"crc\" DevicePath \"\"" Oct 01 10:31:57 crc kubenswrapper[4735]: I1001 10:31:57.907179 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lb8xl\" (UniqueName: \"kubernetes.io/projected/c545ee1a-4d5c-4f0d-a44c-723493610cc7-kube-api-access-lb8xl\") on node \"crc\" DevicePath \"\"" Oct 01 10:31:57 crc kubenswrapper[4735]: I1001 10:31:57.907215 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e382d805-f31c-4226-a51d-1e464b9d613c-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:31:57 crc kubenswrapper[4735]: I1001 10:31:57.907242 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c545ee1a-4d5c-4f0d-a44c-723493610cc7-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:31:58 crc kubenswrapper[4735]: I1001 10:31:58.119822 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-4k7wp"] Oct 01 10:31:58 crc kubenswrapper[4735]: I1001 10:31:58.234459 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-xlnkq" event={"ID":"e382d805-f31c-4226-a51d-1e464b9d613c","Type":"ContainerDied","Data":"2a32d58570f9124f5774dd97e8eb2b7d883dc3f0fb83ad39f2fd528c5817f38c"} Oct 01 10:31:58 crc kubenswrapper[4735]: I1001 10:31:58.234593 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-xlnkq" Oct 01 10:31:58 crc kubenswrapper[4735]: I1001 10:31:58.235621 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-z6pb7" Oct 01 10:31:58 crc kubenswrapper[4735]: I1001 10:31:58.235623 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-z6pb7" event={"ID":"c545ee1a-4d5c-4f0d-a44c-723493610cc7","Type":"ContainerDied","Data":"9cea45d165714687ed4483b9b377d2036ae50ce172bf2658bf842dc79b56d0d6"} Oct 01 10:31:58 crc kubenswrapper[4735]: I1001 10:31:58.240009 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f74d6671-f7b0-46ae-91d4-ddb09a530249","Type":"ContainerStarted","Data":"99e7412a96e156ecb4e75f45624255a2002f318654941bab406bfe4f4add0f39"} Oct 01 10:31:58 crc kubenswrapper[4735]: I1001 10:31:58.242840 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0be1b363-c0e5-4c73-9359-00032a6c8ab9","Type":"ContainerStarted","Data":"dd7a7fe9aeac4b441d9f64f6405e80c705fceb5363e7a65d51c92409606fc6d2"} Oct 01 10:31:58 crc kubenswrapper[4735]: I1001 10:31:58.302298 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-xlnkq"] Oct 01 10:31:58 crc kubenswrapper[4735]: I1001 10:31:58.317349 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-xlnkq"] Oct 01 10:31:58 crc kubenswrapper[4735]: I1001 10:31:58.337539 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-z6pb7"] Oct 01 10:31:58 crc kubenswrapper[4735]: I1001 10:31:58.347541 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-z6pb7"] Oct 01 10:31:58 crc kubenswrapper[4735]: W1001 10:31:58.486190 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43ce7edf_2010_4ccf_ac60_b26606130624.slice/crio-aa8ab854dbc12c352243e2ae561e2eda08ac4423e29f9614b2915ee4ead2758c WatchSource:0}: Error finding 
container aa8ab854dbc12c352243e2ae561e2eda08ac4423e29f9614b2915ee4ead2758c: Status 404 returned error can't find the container with id aa8ab854dbc12c352243e2ae561e2eda08ac4423e29f9614b2915ee4ead2758c Oct 01 10:31:59 crc kubenswrapper[4735]: I1001 10:31:59.250456 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4k7wp" event={"ID":"43ce7edf-2010-4ccf-ac60-b26606130624","Type":"ContainerStarted","Data":"aa8ab854dbc12c352243e2ae561e2eda08ac4423e29f9614b2915ee4ead2758c"} Oct 01 10:31:59 crc kubenswrapper[4735]: I1001 10:31:59.906509 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c545ee1a-4d5c-4f0d-a44c-723493610cc7" path="/var/lib/kubelet/pods/c545ee1a-4d5c-4f0d-a44c-723493610cc7/volumes" Oct 01 10:31:59 crc kubenswrapper[4735]: I1001 10:31:59.907260 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e382d805-f31c-4226-a51d-1e464b9d613c" path="/var/lib/kubelet/pods/e382d805-f31c-4226-a51d-1e464b9d613c/volumes" Oct 01 10:32:06 crc kubenswrapper[4735]: I1001 10:32:06.297468 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"15f57822-7418-47e6-b679-aea87612b3ec","Type":"ContainerStarted","Data":"ad0237f18fd179bd4daaeaabe7556899bccfd3954afdc02d15005fb56c6ec38a"} Oct 01 10:32:06 crc kubenswrapper[4735]: I1001 10:32:06.299518 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zmn7b" event={"ID":"3bcc7869-f6b2-4c99-adde-40577b12c99d","Type":"ContainerStarted","Data":"69690211ee97175577eab3321a7bb657b182370a3a1f31098682fbe6510361bd"} Oct 01 10:32:06 crc kubenswrapper[4735]: I1001 10:32:06.299612 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-zmn7b" Oct 01 10:32:06 crc kubenswrapper[4735]: I1001 10:32:06.300756 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"aa211c9f-0ea4-4b46-9f37-c5917dd0d833","Type":"ContainerStarted","Data":"4766d785624de62f12ab23878466967ff86d7e90e404d1c4d8409ea9bfc8be15"} Oct 01 10:32:06 crc kubenswrapper[4735]: I1001 10:32:06.301784 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ec099172-9672-4553-94cd-c430818da51d","Type":"ContainerStarted","Data":"1a8438acca352b79082643a26709990a2f48e4b39732f5cd0db65d5828df73e2"} Oct 01 10:32:06 crc kubenswrapper[4735]: I1001 10:32:06.303390 4735 generic.go:334] "Generic (PLEG): container finished" podID="43ce7edf-2010-4ccf-ac60-b26606130624" containerID="95eb515346262ebbb7e38c240695a219ba6b099592a43559332170b3c1e89edd" exitCode=0 Oct 01 10:32:06 crc kubenswrapper[4735]: I1001 10:32:06.303437 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4k7wp" event={"ID":"43ce7edf-2010-4ccf-ac60-b26606130624","Type":"ContainerDied","Data":"95eb515346262ebbb7e38c240695a219ba6b099592a43559332170b3c1e89edd"} Oct 01 10:32:06 crc kubenswrapper[4735]: I1001 10:32:06.305640 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"33a18104-7c91-4314-99e8-37396ef7c259","Type":"ContainerStarted","Data":"568160df703d2cf635724c985a41e762a8d01f0a61d7552fac32d2110904d0a8"} Oct 01 10:32:06 crc kubenswrapper[4735]: I1001 10:32:06.308071 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e1850a74-906c-4ee8-aef6-1e5e32661ac6","Type":"ContainerStarted","Data":"f7cf2fe59085ce7bfe6f71db89e3a2cdad8e64220ab3c1df0c167bd562074af9"} Oct 01 10:32:06 crc kubenswrapper[4735]: I1001 10:32:06.308217 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 01 10:32:06 crc kubenswrapper[4735]: I1001 10:32:06.309561 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"35686057-f8f4-4ef2-8a22-b9de9c15c9e5","Type":"ContainerStarted","Data":"edc6ca083d69918424a93310803502455e68616eadc2ea3735c360084ed57a19"} Oct 01 10:32:06 crc kubenswrapper[4735]: I1001 10:32:06.309778 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 01 10:32:06 crc kubenswrapper[4735]: I1001 10:32:06.337775 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=16.58062818 podStartE2EDuration="25.337756253s" podCreationTimestamp="2025-10-01 10:31:41 +0000 UTC" firstStartedPulling="2025-10-01 10:31:56.827470757 +0000 UTC m=+875.520292019" lastFinishedPulling="2025-10-01 10:32:05.58459883 +0000 UTC m=+884.277420092" observedRunningTime="2025-10-01 10:32:06.334892404 +0000 UTC m=+885.027713666" watchObservedRunningTime="2025-10-01 10:32:06.337756253 +0000 UTC m=+885.030577515" Oct 01 10:32:06 crc kubenswrapper[4735]: I1001 10:32:06.385983 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=19.475324485 podStartE2EDuration="27.385962388s" podCreationTimestamp="2025-10-01 10:31:39 +0000 UTC" firstStartedPulling="2025-10-01 10:31:56.820032393 +0000 UTC m=+875.512853655" lastFinishedPulling="2025-10-01 10:32:04.730670296 +0000 UTC m=+883.423491558" observedRunningTime="2025-10-01 10:32:06.373614061 +0000 UTC m=+885.066435323" watchObservedRunningTime="2025-10-01 10:32:06.385962388 +0000 UTC m=+885.078783650" Oct 01 10:32:06 crc kubenswrapper[4735]: I1001 10:32:06.421058 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-zmn7b" podStartSLOduration=12.913547395 podStartE2EDuration="21.421038935s" podCreationTimestamp="2025-10-01 10:31:45 +0000 UTC" firstStartedPulling="2025-10-01 10:31:57.075370212 +0000 UTC m=+875.768191474" lastFinishedPulling="2025-10-01 10:32:05.582861732 +0000 UTC m=+884.275683014" 
observedRunningTime="2025-10-01 10:32:06.414638281 +0000 UTC m=+885.107459543" watchObservedRunningTime="2025-10-01 10:32:06.421038935 +0000 UTC m=+885.113860197" Oct 01 10:32:07 crc kubenswrapper[4735]: I1001 10:32:07.321343 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4k7wp" event={"ID":"43ce7edf-2010-4ccf-ac60-b26606130624","Type":"ContainerStarted","Data":"c64f14083a7b4942af72d0afeedc141966ec955d0ab68c147d424b9c13842381"} Oct 01 10:32:07 crc kubenswrapper[4735]: I1001 10:32:07.321838 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4k7wp" event={"ID":"43ce7edf-2010-4ccf-ac60-b26606130624","Type":"ContainerStarted","Data":"6ae0d8b953621684ae148a28856e088ebac163d7d068fb138c4f9bc6c3d85f76"} Oct 01 10:32:07 crc kubenswrapper[4735]: I1001 10:32:07.324063 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-4k7wp" Oct 01 10:32:07 crc kubenswrapper[4735]: I1001 10:32:07.324091 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-4k7wp" Oct 01 10:32:07 crc kubenswrapper[4735]: I1001 10:32:07.340745 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-4k7wp" podStartSLOduration=15.25609231 podStartE2EDuration="22.340725482s" podCreationTimestamp="2025-10-01 10:31:45 +0000 UTC" firstStartedPulling="2025-10-01 10:31:58.490438427 +0000 UTC m=+877.183259689" lastFinishedPulling="2025-10-01 10:32:05.575071599 +0000 UTC m=+884.267892861" observedRunningTime="2025-10-01 10:32:07.339605721 +0000 UTC m=+886.032426983" watchObservedRunningTime="2025-10-01 10:32:07.340725482 +0000 UTC m=+886.033546764" Oct 01 10:32:09 crc kubenswrapper[4735]: I1001 10:32:09.338051 4735 generic.go:334] "Generic (PLEG): container finished" podID="ec099172-9672-4553-94cd-c430818da51d" containerID="1a8438acca352b79082643a26709990a2f48e4b39732f5cd0db65d5828df73e2" 
exitCode=0 Oct 01 10:32:09 crc kubenswrapper[4735]: I1001 10:32:09.338149 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ec099172-9672-4553-94cd-c430818da51d","Type":"ContainerDied","Data":"1a8438acca352b79082643a26709990a2f48e4b39732f5cd0db65d5828df73e2"} Oct 01 10:32:09 crc kubenswrapper[4735]: I1001 10:32:09.341286 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"33a18104-7c91-4314-99e8-37396ef7c259","Type":"ContainerStarted","Data":"03d70ea4b7b54836a6ee22a51ef14e5dd438a482e560b5f0544ba559c48c2588"} Oct 01 10:32:09 crc kubenswrapper[4735]: I1001 10:32:09.346181 4735 generic.go:334] "Generic (PLEG): container finished" podID="15f57822-7418-47e6-b679-aea87612b3ec" containerID="ad0237f18fd179bd4daaeaabe7556899bccfd3954afdc02d15005fb56c6ec38a" exitCode=0 Oct 01 10:32:09 crc kubenswrapper[4735]: I1001 10:32:09.346282 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"15f57822-7418-47e6-b679-aea87612b3ec","Type":"ContainerDied","Data":"ad0237f18fd179bd4daaeaabe7556899bccfd3954afdc02d15005fb56c6ec38a"} Oct 01 10:32:09 crc kubenswrapper[4735]: I1001 10:32:09.351643 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"aa211c9f-0ea4-4b46-9f37-c5917dd0d833","Type":"ContainerStarted","Data":"5e77576247dce5b8e90d056a4b914c0c33e0a7e7b8d19f21abe979883b764835"} Oct 01 10:32:09 crc kubenswrapper[4735]: I1001 10:32:09.419719 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=10.963689543 podStartE2EDuration="22.419703516s" podCreationTimestamp="2025-10-01 10:31:47 +0000 UTC" firstStartedPulling="2025-10-01 10:31:57.119985859 +0000 UTC m=+875.812807121" lastFinishedPulling="2025-10-01 10:32:08.575999832 +0000 UTC m=+887.268821094" observedRunningTime="2025-10-01 10:32:09.417730512 +0000 UTC 
m=+888.110551814" watchObservedRunningTime="2025-10-01 10:32:09.419703516 +0000 UTC m=+888.112524778" Oct 01 10:32:09 crc kubenswrapper[4735]: I1001 10:32:09.458809 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=12.948739425 podStartE2EDuration="24.458783352s" podCreationTimestamp="2025-10-01 10:31:45 +0000 UTC" firstStartedPulling="2025-10-01 10:31:57.082094845 +0000 UTC m=+875.774916107" lastFinishedPulling="2025-10-01 10:32:08.592138772 +0000 UTC m=+887.284960034" observedRunningTime="2025-10-01 10:32:09.442312822 +0000 UTC m=+888.135134124" watchObservedRunningTime="2025-10-01 10:32:09.458783352 +0000 UTC m=+888.151604624" Oct 01 10:32:09 crc kubenswrapper[4735]: I1001 10:32:09.907114 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 01 10:32:09 crc kubenswrapper[4735]: I1001 10:32:09.932524 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.117279 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.360362 4735 generic.go:334] "Generic (PLEG): container finished" podID="dc112b48-aa35-4a2a-a496-ac720211a123" containerID="391f026922b5798f120920cc50557d9011a61bfc747c256e08ca0b5d6cac8b31" exitCode=0 Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.360491 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gdsfg" event={"ID":"dc112b48-aa35-4a2a-a496-ac720211a123","Type":"ContainerDied","Data":"391f026922b5798f120920cc50557d9011a61bfc747c256e08ca0b5d6cac8b31"} Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.363691 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"ec099172-9672-4553-94cd-c430818da51d","Type":"ContainerStarted","Data":"94aebc102bcc1933e5ae170e6fd8bae372d2b3981dd8550d6a3a75a732726e39"} Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.365811 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"15f57822-7418-47e6-b679-aea87612b3ec","Type":"ContainerStarted","Data":"02d704308e07b725698a26ca3f2197dfa4ba3a5dbb04e3e65a44ea8046c2ce7e"} Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.366308 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.419112 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=24.635961415 podStartE2EDuration="33.419093378s" podCreationTimestamp="2025-10-01 10:31:37 +0000 UTC" firstStartedPulling="2025-10-01 10:31:56.830798487 +0000 UTC m=+875.523619749" lastFinishedPulling="2025-10-01 10:32:05.61393043 +0000 UTC m=+884.306751712" observedRunningTime="2025-10-01 10:32:10.415060528 +0000 UTC m=+889.107881800" watchObservedRunningTime="2025-10-01 10:32:10.419093378 +0000 UTC m=+889.111914630" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.421161 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.440845 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=23.951836755 podStartE2EDuration="32.440831761s" podCreationTimestamp="2025-10-01 10:31:38 +0000 UTC" firstStartedPulling="2025-10-01 10:31:57.078373204 +0000 UTC m=+875.771194466" lastFinishedPulling="2025-10-01 10:32:05.5673682 +0000 UTC m=+884.260189472" observedRunningTime="2025-10-01 10:32:10.439015312 +0000 UTC m=+889.131836574" watchObservedRunningTime="2025-10-01 10:32:10.440831761 
+0000 UTC m=+889.133653013" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.678213 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-ldhfc"] Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.707662 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-vwv82"] Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.713417 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-vwv82" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.716175 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.722460 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-vwv82"] Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.778716 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-4vkk2"] Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.780741 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-4vkk2" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.783707 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.803935 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9j2g\" (UniqueName: \"kubernetes.io/projected/83a05602-9f43-41cf-af06-9e6d2109e6c9-kube-api-access-k9j2g\") pod \"ovn-controller-metrics-4vkk2\" (UID: \"83a05602-9f43-41cf-af06-9e6d2109e6c9\") " pod="openstack/ovn-controller-metrics-4vkk2" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.803996 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83a05602-9f43-41cf-af06-9e6d2109e6c9-combined-ca-bundle\") pod \"ovn-controller-metrics-4vkk2\" (UID: \"83a05602-9f43-41cf-af06-9e6d2109e6c9\") " pod="openstack/ovn-controller-metrics-4vkk2" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.804028 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/83a05602-9f43-41cf-af06-9e6d2109e6c9-ovn-rundir\") pod \"ovn-controller-metrics-4vkk2\" (UID: \"83a05602-9f43-41cf-af06-9e6d2109e6c9\") " pod="openstack/ovn-controller-metrics-4vkk2" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.804066 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5r5z\" (UniqueName: \"kubernetes.io/projected/dd20020a-5a21-4558-91de-23f3b8d6278f-kube-api-access-d5r5z\") pod \"dnsmasq-dns-7f896c8c65-vwv82\" (UID: \"dd20020a-5a21-4558-91de-23f3b8d6278f\") " pod="openstack/dnsmasq-dns-7f896c8c65-vwv82" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.804092 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd20020a-5a21-4558-91de-23f3b8d6278f-config\") pod \"dnsmasq-dns-7f896c8c65-vwv82\" (UID: \"dd20020a-5a21-4558-91de-23f3b8d6278f\") " pod="openstack/dnsmasq-dns-7f896c8c65-vwv82" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.804120 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83a05602-9f43-41cf-af06-9e6d2109e6c9-config\") pod \"ovn-controller-metrics-4vkk2\" (UID: \"83a05602-9f43-41cf-af06-9e6d2109e6c9\") " pod="openstack/ovn-controller-metrics-4vkk2" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.804146 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/83a05602-9f43-41cf-af06-9e6d2109e6c9-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-4vkk2\" (UID: \"83a05602-9f43-41cf-af06-9e6d2109e6c9\") " pod="openstack/ovn-controller-metrics-4vkk2" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.804190 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd20020a-5a21-4558-91de-23f3b8d6278f-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-vwv82\" (UID: \"dd20020a-5a21-4558-91de-23f3b8d6278f\") " pod="openstack/dnsmasq-dns-7f896c8c65-vwv82" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.804222 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/83a05602-9f43-41cf-af06-9e6d2109e6c9-ovs-rundir\") pod \"ovn-controller-metrics-4vkk2\" (UID: \"83a05602-9f43-41cf-af06-9e6d2109e6c9\") " pod="openstack/ovn-controller-metrics-4vkk2" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.804248 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd20020a-5a21-4558-91de-23f3b8d6278f-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-vwv82\" (UID: \"dd20020a-5a21-4558-91de-23f3b8d6278f\") " pod="openstack/dnsmasq-dns-7f896c8c65-vwv82" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.839884 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-4vkk2"] Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.879548 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.905805 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9j2g\" (UniqueName: \"kubernetes.io/projected/83a05602-9f43-41cf-af06-9e6d2109e6c9-kube-api-access-k9j2g\") pod \"ovn-controller-metrics-4vkk2\" (UID: \"83a05602-9f43-41cf-af06-9e6d2109e6c9\") " pod="openstack/ovn-controller-metrics-4vkk2" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.905857 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83a05602-9f43-41cf-af06-9e6d2109e6c9-combined-ca-bundle\") pod \"ovn-controller-metrics-4vkk2\" (UID: \"83a05602-9f43-41cf-af06-9e6d2109e6c9\") " pod="openstack/ovn-controller-metrics-4vkk2" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.905886 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/83a05602-9f43-41cf-af06-9e6d2109e6c9-ovn-rundir\") pod \"ovn-controller-metrics-4vkk2\" (UID: \"83a05602-9f43-41cf-af06-9e6d2109e6c9\") " pod="openstack/ovn-controller-metrics-4vkk2" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.905920 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-d5r5z\" (UniqueName: \"kubernetes.io/projected/dd20020a-5a21-4558-91de-23f3b8d6278f-kube-api-access-d5r5z\") pod \"dnsmasq-dns-7f896c8c65-vwv82\" (UID: \"dd20020a-5a21-4558-91de-23f3b8d6278f\") " pod="openstack/dnsmasq-dns-7f896c8c65-vwv82" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.905958 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd20020a-5a21-4558-91de-23f3b8d6278f-config\") pod \"dnsmasq-dns-7f896c8c65-vwv82\" (UID: \"dd20020a-5a21-4558-91de-23f3b8d6278f\") " pod="openstack/dnsmasq-dns-7f896c8c65-vwv82" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.905986 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83a05602-9f43-41cf-af06-9e6d2109e6c9-config\") pod \"ovn-controller-metrics-4vkk2\" (UID: \"83a05602-9f43-41cf-af06-9e6d2109e6c9\") " pod="openstack/ovn-controller-metrics-4vkk2" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.906016 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/83a05602-9f43-41cf-af06-9e6d2109e6c9-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-4vkk2\" (UID: \"83a05602-9f43-41cf-af06-9e6d2109e6c9\") " pod="openstack/ovn-controller-metrics-4vkk2" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.906076 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd20020a-5a21-4558-91de-23f3b8d6278f-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-vwv82\" (UID: \"dd20020a-5a21-4558-91de-23f3b8d6278f\") " pod="openstack/dnsmasq-dns-7f896c8c65-vwv82" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.906107 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/83a05602-9f43-41cf-af06-9e6d2109e6c9-ovs-rundir\") pod \"ovn-controller-metrics-4vkk2\" (UID: \"83a05602-9f43-41cf-af06-9e6d2109e6c9\") " pod="openstack/ovn-controller-metrics-4vkk2" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.906132 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd20020a-5a21-4558-91de-23f3b8d6278f-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-vwv82\" (UID: \"dd20020a-5a21-4558-91de-23f3b8d6278f\") " pod="openstack/dnsmasq-dns-7f896c8c65-vwv82" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.909208 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83a05602-9f43-41cf-af06-9e6d2109e6c9-config\") pod \"ovn-controller-metrics-4vkk2\" (UID: \"83a05602-9f43-41cf-af06-9e6d2109e6c9\") " pod="openstack/ovn-controller-metrics-4vkk2" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.909222 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd20020a-5a21-4558-91de-23f3b8d6278f-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-vwv82\" (UID: \"dd20020a-5a21-4558-91de-23f3b8d6278f\") " pod="openstack/dnsmasq-dns-7f896c8c65-vwv82" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.909395 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/83a05602-9f43-41cf-af06-9e6d2109e6c9-ovn-rundir\") pod \"ovn-controller-metrics-4vkk2\" (UID: \"83a05602-9f43-41cf-af06-9e6d2109e6c9\") " pod="openstack/ovn-controller-metrics-4vkk2" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.909667 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd20020a-5a21-4558-91de-23f3b8d6278f-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-vwv82\" (UID: 
\"dd20020a-5a21-4558-91de-23f3b8d6278f\") " pod="openstack/dnsmasq-dns-7f896c8c65-vwv82" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.909747 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/83a05602-9f43-41cf-af06-9e6d2109e6c9-ovs-rundir\") pod \"ovn-controller-metrics-4vkk2\" (UID: \"83a05602-9f43-41cf-af06-9e6d2109e6c9\") " pod="openstack/ovn-controller-metrics-4vkk2" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.910519 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd20020a-5a21-4558-91de-23f3b8d6278f-config\") pod \"dnsmasq-dns-7f896c8c65-vwv82\" (UID: \"dd20020a-5a21-4558-91de-23f3b8d6278f\") " pod="openstack/dnsmasq-dns-7f896c8c65-vwv82" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.925065 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.926603 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9j2g\" (UniqueName: \"kubernetes.io/projected/83a05602-9f43-41cf-af06-9e6d2109e6c9-kube-api-access-k9j2g\") pod \"ovn-controller-metrics-4vkk2\" (UID: \"83a05602-9f43-41cf-af06-9e6d2109e6c9\") " pod="openstack/ovn-controller-metrics-4vkk2" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.926656 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83a05602-9f43-41cf-af06-9e6d2109e6c9-combined-ca-bundle\") pod \"ovn-controller-metrics-4vkk2\" (UID: \"83a05602-9f43-41cf-af06-9e6d2109e6c9\") " pod="openstack/ovn-controller-metrics-4vkk2" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.928902 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/83a05602-9f43-41cf-af06-9e6d2109e6c9-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-4vkk2\" (UID: \"83a05602-9f43-41cf-af06-9e6d2109e6c9\") " pod="openstack/ovn-controller-metrics-4vkk2" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.930169 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5r5z\" (UniqueName: \"kubernetes.io/projected/dd20020a-5a21-4558-91de-23f3b8d6278f-kube-api-access-d5r5z\") pod \"dnsmasq-dns-7f896c8c65-vwv82\" (UID: \"dd20020a-5a21-4558-91de-23f3b8d6278f\") " pod="openstack/dnsmasq-dns-7f896c8c65-vwv82" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.968531 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gdsfg"] Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.993631 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-wn8jw"] Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.994882 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-wn8jw" Oct 01 10:32:10 crc kubenswrapper[4735]: I1001 10:32:10.997681 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.000799 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-wn8jw"] Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.011721 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed5ea827-1e94-4de0-ba76-958a200fb185-config\") pod \"dnsmasq-dns-86db49b7ff-wn8jw\" (UID: \"ed5ea827-1e94-4de0-ba76-958a200fb185\") " pod="openstack/dnsmasq-dns-86db49b7ff-wn8jw" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.011788 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed5ea827-1e94-4de0-ba76-958a200fb185-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-wn8jw\" (UID: \"ed5ea827-1e94-4de0-ba76-958a200fb185\") " pod="openstack/dnsmasq-dns-86db49b7ff-wn8jw" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.011895 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed5ea827-1e94-4de0-ba76-958a200fb185-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-wn8jw\" (UID: \"ed5ea827-1e94-4de0-ba76-958a200fb185\") " pod="openstack/dnsmasq-dns-86db49b7ff-wn8jw" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.011924 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lxvm\" (UniqueName: \"kubernetes.io/projected/ed5ea827-1e94-4de0-ba76-958a200fb185-kube-api-access-4lxvm\") pod \"dnsmasq-dns-86db49b7ff-wn8jw\" (UID: \"ed5ea827-1e94-4de0-ba76-958a200fb185\") " 
pod="openstack/dnsmasq-dns-86db49b7ff-wn8jw" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.011947 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed5ea827-1e94-4de0-ba76-958a200fb185-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-wn8jw\" (UID: \"ed5ea827-1e94-4de0-ba76-958a200fb185\") " pod="openstack/dnsmasq-dns-86db49b7ff-wn8jw" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.039937 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-vwv82" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.057926 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-ldhfc" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.097629 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-4vkk2" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.112921 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eae96f88-b581-42fd-b127-bd1e94d4f977-config\") pod \"eae96f88-b581-42fd-b127-bd1e94d4f977\" (UID: \"eae96f88-b581-42fd-b127-bd1e94d4f977\") " Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.113386 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eae96f88-b581-42fd-b127-bd1e94d4f977-config" (OuterVolumeSpecName: "config") pod "eae96f88-b581-42fd-b127-bd1e94d4f977" (UID: "eae96f88-b581-42fd-b127-bd1e94d4f977"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.114193 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzq46\" (UniqueName: \"kubernetes.io/projected/eae96f88-b581-42fd-b127-bd1e94d4f977-kube-api-access-xzq46\") pod \"eae96f88-b581-42fd-b127-bd1e94d4f977\" (UID: \"eae96f88-b581-42fd-b127-bd1e94d4f977\") " Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.114258 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eae96f88-b581-42fd-b127-bd1e94d4f977-dns-svc\") pod \"eae96f88-b581-42fd-b127-bd1e94d4f977\" (UID: \"eae96f88-b581-42fd-b127-bd1e94d4f977\") " Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.114593 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed5ea827-1e94-4de0-ba76-958a200fb185-config\") pod \"dnsmasq-dns-86db49b7ff-wn8jw\" (UID: \"ed5ea827-1e94-4de0-ba76-958a200fb185\") " pod="openstack/dnsmasq-dns-86db49b7ff-wn8jw" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.114680 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed5ea827-1e94-4de0-ba76-958a200fb185-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-wn8jw\" (UID: \"ed5ea827-1e94-4de0-ba76-958a200fb185\") " pod="openstack/dnsmasq-dns-86db49b7ff-wn8jw" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.114819 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed5ea827-1e94-4de0-ba76-958a200fb185-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-wn8jw\" (UID: \"ed5ea827-1e94-4de0-ba76-958a200fb185\") " pod="openstack/dnsmasq-dns-86db49b7ff-wn8jw" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.114852 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lxvm\" (UniqueName: \"kubernetes.io/projected/ed5ea827-1e94-4de0-ba76-958a200fb185-kube-api-access-4lxvm\") pod \"dnsmasq-dns-86db49b7ff-wn8jw\" (UID: \"ed5ea827-1e94-4de0-ba76-958a200fb185\") " pod="openstack/dnsmasq-dns-86db49b7ff-wn8jw" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.114877 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed5ea827-1e94-4de0-ba76-958a200fb185-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-wn8jw\" (UID: \"ed5ea827-1e94-4de0-ba76-958a200fb185\") " pod="openstack/dnsmasq-dns-86db49b7ff-wn8jw" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.115000 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eae96f88-b581-42fd-b127-bd1e94d4f977-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.115751 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed5ea827-1e94-4de0-ba76-958a200fb185-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-wn8jw\" (UID: \"ed5ea827-1e94-4de0-ba76-958a200fb185\") " pod="openstack/dnsmasq-dns-86db49b7ff-wn8jw" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.116354 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eae96f88-b581-42fd-b127-bd1e94d4f977-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eae96f88-b581-42fd-b127-bd1e94d4f977" (UID: "eae96f88-b581-42fd-b127-bd1e94d4f977"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.116359 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed5ea827-1e94-4de0-ba76-958a200fb185-config\") pod \"dnsmasq-dns-86db49b7ff-wn8jw\" (UID: \"ed5ea827-1e94-4de0-ba76-958a200fb185\") " pod="openstack/dnsmasq-dns-86db49b7ff-wn8jw" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.116616 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed5ea827-1e94-4de0-ba76-958a200fb185-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-wn8jw\" (UID: \"ed5ea827-1e94-4de0-ba76-958a200fb185\") " pod="openstack/dnsmasq-dns-86db49b7ff-wn8jw" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.116956 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed5ea827-1e94-4de0-ba76-958a200fb185-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-wn8jw\" (UID: \"ed5ea827-1e94-4de0-ba76-958a200fb185\") " pod="openstack/dnsmasq-dns-86db49b7ff-wn8jw" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.117986 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eae96f88-b581-42fd-b127-bd1e94d4f977-kube-api-access-xzq46" (OuterVolumeSpecName: "kube-api-access-xzq46") pod "eae96f88-b581-42fd-b127-bd1e94d4f977" (UID: "eae96f88-b581-42fd-b127-bd1e94d4f977"). InnerVolumeSpecName "kube-api-access-xzq46". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.135572 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lxvm\" (UniqueName: \"kubernetes.io/projected/ed5ea827-1e94-4de0-ba76-958a200fb185-kube-api-access-4lxvm\") pod \"dnsmasq-dns-86db49b7ff-wn8jw\" (UID: \"ed5ea827-1e94-4de0-ba76-958a200fb185\") " pod="openstack/dnsmasq-dns-86db49b7ff-wn8jw" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.217181 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzq46\" (UniqueName: \"kubernetes.io/projected/eae96f88-b581-42fd-b127-bd1e94d4f977-kube-api-access-xzq46\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.217212 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eae96f88-b581-42fd-b127-bd1e94d4f977-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.359487 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-wn8jw" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.374937 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-ldhfc" event={"ID":"eae96f88-b581-42fd-b127-bd1e94d4f977","Type":"ContainerDied","Data":"02aee1a70ba097c58ef328262b3276274f2919c5b28935ad41903fb1898326ef"} Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.375022 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-ldhfc" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.378642 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-gdsfg" podUID="dc112b48-aa35-4a2a-a496-ac720211a123" containerName="dnsmasq-dns" containerID="cri-o://45088c109f8f888c8ea2b0475af4cbfc60c27c684bf231e144f83aac7184b13c" gracePeriod=10 Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.378760 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gdsfg" event={"ID":"dc112b48-aa35-4a2a-a496-ac720211a123","Type":"ContainerStarted","Data":"45088c109f8f888c8ea2b0475af4cbfc60c27c684bf231e144f83aac7184b13c"} Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.378833 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-gdsfg" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.380054 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.400322 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-gdsfg" podStartSLOduration=3.241818537 podStartE2EDuration="36.400300394s" podCreationTimestamp="2025-10-01 10:31:35 +0000 UTC" firstStartedPulling="2025-10-01 10:31:36.177055072 +0000 UTC m=+854.869876334" lastFinishedPulling="2025-10-01 10:32:09.335536929 +0000 UTC m=+888.028358191" observedRunningTime="2025-10-01 10:32:11.398596857 +0000 UTC m=+890.091418119" watchObservedRunningTime="2025-10-01 10:32:11.400300394 +0000 UTC m=+890.093121676" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.433358 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.442402 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-666b6646f7-ldhfc"] Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.447799 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-ldhfc"] Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.512762 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-vwv82"] Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.576421 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.582595 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.584716 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.584931 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.585257 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.588459 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-5hw6z" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.589175 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.624064 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-wn8jw"] Oct 01 10:32:11 crc kubenswrapper[4735]: W1001 10:32:11.626333 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded5ea827_1e94_4de0_ba76_958a200fb185.slice/crio-d94df195d73de0766dcaa45b046599fca0247d7fc4f59535151b4b3a8122f7fa 
WatchSource:0}: Error finding container d94df195d73de0766dcaa45b046599fca0247d7fc4f59535151b4b3a8122f7fa: Status 404 returned error can't find the container with id d94df195d73de0766dcaa45b046599fca0247d7fc4f59535151b4b3a8122f7fa Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.628912 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/356e4644-7f54-4b34-b72a-510958be19e5-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"356e4644-7f54-4b34-b72a-510958be19e5\") " pod="openstack/ovn-northd-0" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.628929 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-4vkk2"] Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.628999 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n8qp\" (UniqueName: \"kubernetes.io/projected/356e4644-7f54-4b34-b72a-510958be19e5-kube-api-access-4n8qp\") pod \"ovn-northd-0\" (UID: \"356e4644-7f54-4b34-b72a-510958be19e5\") " pod="openstack/ovn-northd-0" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.629116 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/356e4644-7f54-4b34-b72a-510958be19e5-scripts\") pod \"ovn-northd-0\" (UID: \"356e4644-7f54-4b34-b72a-510958be19e5\") " pod="openstack/ovn-northd-0" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.629176 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/356e4644-7f54-4b34-b72a-510958be19e5-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"356e4644-7f54-4b34-b72a-510958be19e5\") " pod="openstack/ovn-northd-0" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.629206 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/356e4644-7f54-4b34-b72a-510958be19e5-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"356e4644-7f54-4b34-b72a-510958be19e5\") " pod="openstack/ovn-northd-0" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.629249 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/356e4644-7f54-4b34-b72a-510958be19e5-config\") pod \"ovn-northd-0\" (UID: \"356e4644-7f54-4b34-b72a-510958be19e5\") " pod="openstack/ovn-northd-0" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.629273 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/356e4644-7f54-4b34-b72a-510958be19e5-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"356e4644-7f54-4b34-b72a-510958be19e5\") " pod="openstack/ovn-northd-0" Oct 01 10:32:11 crc kubenswrapper[4735]: W1001 10:32:11.634600 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83a05602_9f43_41cf_af06_9e6d2109e6c9.slice/crio-62482765a0f3eef4cd184cecd44f61104d59d2f226d28a05d6370cb78ea11dc2 WatchSource:0}: Error finding container 62482765a0f3eef4cd184cecd44f61104d59d2f226d28a05d6370cb78ea11dc2: Status 404 returned error can't find the container with id 62482765a0f3eef4cd184cecd44f61104d59d2f226d28a05d6370cb78ea11dc2 Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.730141 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/356e4644-7f54-4b34-b72a-510958be19e5-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"356e4644-7f54-4b34-b72a-510958be19e5\") " pod="openstack/ovn-northd-0" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 
10:32:11.730180 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/356e4644-7f54-4b34-b72a-510958be19e5-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"356e4644-7f54-4b34-b72a-510958be19e5\") " pod="openstack/ovn-northd-0" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.730221 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/356e4644-7f54-4b34-b72a-510958be19e5-config\") pod \"ovn-northd-0\" (UID: \"356e4644-7f54-4b34-b72a-510958be19e5\") " pod="openstack/ovn-northd-0" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.730242 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/356e4644-7f54-4b34-b72a-510958be19e5-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"356e4644-7f54-4b34-b72a-510958be19e5\") " pod="openstack/ovn-northd-0" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.730299 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/356e4644-7f54-4b34-b72a-510958be19e5-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"356e4644-7f54-4b34-b72a-510958be19e5\") " pod="openstack/ovn-northd-0" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.730322 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n8qp\" (UniqueName: \"kubernetes.io/projected/356e4644-7f54-4b34-b72a-510958be19e5-kube-api-access-4n8qp\") pod \"ovn-northd-0\" (UID: \"356e4644-7f54-4b34-b72a-510958be19e5\") " pod="openstack/ovn-northd-0" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.730355 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/356e4644-7f54-4b34-b72a-510958be19e5-scripts\") pod 
\"ovn-northd-0\" (UID: \"356e4644-7f54-4b34-b72a-510958be19e5\") " pod="openstack/ovn-northd-0" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.730982 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/356e4644-7f54-4b34-b72a-510958be19e5-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"356e4644-7f54-4b34-b72a-510958be19e5\") " pod="openstack/ovn-northd-0" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.731141 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/356e4644-7f54-4b34-b72a-510958be19e5-scripts\") pod \"ovn-northd-0\" (UID: \"356e4644-7f54-4b34-b72a-510958be19e5\") " pod="openstack/ovn-northd-0" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.731912 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/356e4644-7f54-4b34-b72a-510958be19e5-config\") pod \"ovn-northd-0\" (UID: \"356e4644-7f54-4b34-b72a-510958be19e5\") " pod="openstack/ovn-northd-0" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.741309 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/356e4644-7f54-4b34-b72a-510958be19e5-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"356e4644-7f54-4b34-b72a-510958be19e5\") " pod="openstack/ovn-northd-0" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.742929 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/356e4644-7f54-4b34-b72a-510958be19e5-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"356e4644-7f54-4b34-b72a-510958be19e5\") " pod="openstack/ovn-northd-0" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.747674 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/356e4644-7f54-4b34-b72a-510958be19e5-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"356e4644-7f54-4b34-b72a-510958be19e5\") " pod="openstack/ovn-northd-0" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.751466 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n8qp\" (UniqueName: \"kubernetes.io/projected/356e4644-7f54-4b34-b72a-510958be19e5-kube-api-access-4n8qp\") pod \"ovn-northd-0\" (UID: \"356e4644-7f54-4b34-b72a-510958be19e5\") " pod="openstack/ovn-northd-0" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.898328 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 01 10:32:11 crc kubenswrapper[4735]: I1001 10:32:11.918360 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eae96f88-b581-42fd-b127-bd1e94d4f977" path="/var/lib/kubelet/pods/eae96f88-b581-42fd-b127-bd1e94d4f977/volumes" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.047377 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gdsfg" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.080670 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.097152 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-vwv82"] Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.124014 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-mf77b"] Oct 01 10:32:12 crc kubenswrapper[4735]: E1001 10:32:12.124294 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc112b48-aa35-4a2a-a496-ac720211a123" containerName="init" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.124306 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc112b48-aa35-4a2a-a496-ac720211a123" containerName="init" Oct 01 10:32:12 crc kubenswrapper[4735]: E1001 10:32:12.124320 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc112b48-aa35-4a2a-a496-ac720211a123" containerName="dnsmasq-dns" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.124326 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc112b48-aa35-4a2a-a496-ac720211a123" containerName="dnsmasq-dns" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.124481 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc112b48-aa35-4a2a-a496-ac720211a123" containerName="dnsmasq-dns" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.165657 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-mf77b" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.175288 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-mf77b"] Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.241451 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzj6q\" (UniqueName: \"kubernetes.io/projected/dc112b48-aa35-4a2a-a496-ac720211a123-kube-api-access-gzj6q\") pod \"dc112b48-aa35-4a2a-a496-ac720211a123\" (UID: \"dc112b48-aa35-4a2a-a496-ac720211a123\") " Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.241672 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc112b48-aa35-4a2a-a496-ac720211a123-dns-svc\") pod \"dc112b48-aa35-4a2a-a496-ac720211a123\" (UID: \"dc112b48-aa35-4a2a-a496-ac720211a123\") " Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.241759 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc112b48-aa35-4a2a-a496-ac720211a123-config\") pod \"dc112b48-aa35-4a2a-a496-ac720211a123\" (UID: \"dc112b48-aa35-4a2a-a496-ac720211a123\") " Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.247346 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc112b48-aa35-4a2a-a496-ac720211a123-kube-api-access-gzj6q" (OuterVolumeSpecName: "kube-api-access-gzj6q") pod "dc112b48-aa35-4a2a-a496-ac720211a123" (UID: "dc112b48-aa35-4a2a-a496-ac720211a123"). InnerVolumeSpecName "kube-api-access-gzj6q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.289122 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc112b48-aa35-4a2a-a496-ac720211a123-config" (OuterVolumeSpecName: "config") pod "dc112b48-aa35-4a2a-a496-ac720211a123" (UID: "dc112b48-aa35-4a2a-a496-ac720211a123"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.297079 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc112b48-aa35-4a2a-a496-ac720211a123-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dc112b48-aa35-4a2a-a496-ac720211a123" (UID: "dc112b48-aa35-4a2a-a496-ac720211a123"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.346298 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9f055a8-fec6-40c5-bc84-b1455265886a-config\") pod \"dnsmasq-dns-698758b865-mf77b\" (UID: \"f9f055a8-fec6-40c5-bc84-b1455265886a\") " pod="openstack/dnsmasq-dns-698758b865-mf77b" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.346368 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9f055a8-fec6-40c5-bc84-b1455265886a-dns-svc\") pod \"dnsmasq-dns-698758b865-mf77b\" (UID: \"f9f055a8-fec6-40c5-bc84-b1455265886a\") " pod="openstack/dnsmasq-dns-698758b865-mf77b" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.346424 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94rlx\" (UniqueName: \"kubernetes.io/projected/f9f055a8-fec6-40c5-bc84-b1455265886a-kube-api-access-94rlx\") pod 
\"dnsmasq-dns-698758b865-mf77b\" (UID: \"f9f055a8-fec6-40c5-bc84-b1455265886a\") " pod="openstack/dnsmasq-dns-698758b865-mf77b" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.346707 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9f055a8-fec6-40c5-bc84-b1455265886a-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-mf77b\" (UID: \"f9f055a8-fec6-40c5-bc84-b1455265886a\") " pod="openstack/dnsmasq-dns-698758b865-mf77b" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.346785 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9f055a8-fec6-40c5-bc84-b1455265886a-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-mf77b\" (UID: \"f9f055a8-fec6-40c5-bc84-b1455265886a\") " pod="openstack/dnsmasq-dns-698758b865-mf77b" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.347081 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzj6q\" (UniqueName: \"kubernetes.io/projected/dc112b48-aa35-4a2a-a496-ac720211a123-kube-api-access-gzj6q\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.347120 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc112b48-aa35-4a2a-a496-ac720211a123-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.347131 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc112b48-aa35-4a2a-a496-ac720211a123-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.386694 4735 generic.go:334] "Generic (PLEG): container finished" podID="ed5ea827-1e94-4de0-ba76-958a200fb185" containerID="bfb064342e3579a661318b2308b767f8f14fa83c4dba82c5fb13dabb5d813280" exitCode=0 Oct 01 
10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.386760 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-wn8jw" event={"ID":"ed5ea827-1e94-4de0-ba76-958a200fb185","Type":"ContainerDied","Data":"bfb064342e3579a661318b2308b767f8f14fa83c4dba82c5fb13dabb5d813280"} Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.386784 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-wn8jw" event={"ID":"ed5ea827-1e94-4de0-ba76-958a200fb185","Type":"ContainerStarted","Data":"d94df195d73de0766dcaa45b046599fca0247d7fc4f59535151b4b3a8122f7fa"} Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.390744 4735 generic.go:334] "Generic (PLEG): container finished" podID="dc112b48-aa35-4a2a-a496-ac720211a123" containerID="45088c109f8f888c8ea2b0475af4cbfc60c27c684bf231e144f83aac7184b13c" exitCode=0 Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.390796 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gdsfg" event={"ID":"dc112b48-aa35-4a2a-a496-ac720211a123","Type":"ContainerDied","Data":"45088c109f8f888c8ea2b0475af4cbfc60c27c684bf231e144f83aac7184b13c"} Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.390819 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gdsfg" event={"ID":"dc112b48-aa35-4a2a-a496-ac720211a123","Type":"ContainerDied","Data":"3bc2964d8e62715123e97266b152703e0d2a7fbb19f77d3d8d6affc9c68380de"} Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.390834 4735 scope.go:117] "RemoveContainer" containerID="45088c109f8f888c8ea2b0475af4cbfc60c27c684bf231e144f83aac7184b13c" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.390936 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gdsfg" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.393334 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-4vkk2" event={"ID":"83a05602-9f43-41cf-af06-9e6d2109e6c9","Type":"ContainerStarted","Data":"47d4478650c71483d5851f657c67e0ca25301d45e873a515fc88bef8fbf51dd7"} Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.393360 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-4vkk2" event={"ID":"83a05602-9f43-41cf-af06-9e6d2109e6c9","Type":"ContainerStarted","Data":"62482765a0f3eef4cd184cecd44f61104d59d2f226d28a05d6370cb78ea11dc2"} Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.394634 4735 generic.go:334] "Generic (PLEG): container finished" podID="dd20020a-5a21-4558-91de-23f3b8d6278f" containerID="f3331e7b7d737a344c4e88af67d5b2da3e2d3928a3917aaf97aaae9d1b56858a" exitCode=0 Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.394694 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-vwv82" event={"ID":"dd20020a-5a21-4558-91de-23f3b8d6278f","Type":"ContainerDied","Data":"f3331e7b7d737a344c4e88af67d5b2da3e2d3928a3917aaf97aaae9d1b56858a"} Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.394759 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-vwv82" event={"ID":"dd20020a-5a21-4558-91de-23f3b8d6278f","Type":"ContainerStarted","Data":"a1530b0dd8be7e28d40ae4c5c1f65d57fbb0d80fd177817625fc574e15b233a5"} Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.425614 4735 scope.go:117] "RemoveContainer" containerID="391f026922b5798f120920cc50557d9011a61bfc747c256e08ca0b5d6cac8b31" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.428109 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 01 10:32:12 crc kubenswrapper[4735]: W1001 10:32:12.432623 4735 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod356e4644_7f54_4b34_b72a_510958be19e5.slice/crio-979a49f0a3c9a0abfb83f358d09184ebff21850386e18ff83a20aa30176f6a88 WatchSource:0}: Error finding container 979a49f0a3c9a0abfb83f358d09184ebff21850386e18ff83a20aa30176f6a88: Status 404 returned error can't find the container with id 979a49f0a3c9a0abfb83f358d09184ebff21850386e18ff83a20aa30176f6a88 Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.448153 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9f055a8-fec6-40c5-bc84-b1455265886a-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-mf77b\" (UID: \"f9f055a8-fec6-40c5-bc84-b1455265886a\") " pod="openstack/dnsmasq-dns-698758b865-mf77b" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.448194 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9f055a8-fec6-40c5-bc84-b1455265886a-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-mf77b\" (UID: \"f9f055a8-fec6-40c5-bc84-b1455265886a\") " pod="openstack/dnsmasq-dns-698758b865-mf77b" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.448253 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9f055a8-fec6-40c5-bc84-b1455265886a-config\") pod \"dnsmasq-dns-698758b865-mf77b\" (UID: \"f9f055a8-fec6-40c5-bc84-b1455265886a\") " pod="openstack/dnsmasq-dns-698758b865-mf77b" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.448278 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9f055a8-fec6-40c5-bc84-b1455265886a-dns-svc\") pod \"dnsmasq-dns-698758b865-mf77b\" (UID: \"f9f055a8-fec6-40c5-bc84-b1455265886a\") " pod="openstack/dnsmasq-dns-698758b865-mf77b" Oct 01 10:32:12 
crc kubenswrapper[4735]: I1001 10:32:12.448296 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94rlx\" (UniqueName: \"kubernetes.io/projected/f9f055a8-fec6-40c5-bc84-b1455265886a-kube-api-access-94rlx\") pod \"dnsmasq-dns-698758b865-mf77b\" (UID: \"f9f055a8-fec6-40c5-bc84-b1455265886a\") " pod="openstack/dnsmasq-dns-698758b865-mf77b" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.449416 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9f055a8-fec6-40c5-bc84-b1455265886a-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-mf77b\" (UID: \"f9f055a8-fec6-40c5-bc84-b1455265886a\") " pod="openstack/dnsmasq-dns-698758b865-mf77b" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.449930 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9f055a8-fec6-40c5-bc84-b1455265886a-config\") pod \"dnsmasq-dns-698758b865-mf77b\" (UID: \"f9f055a8-fec6-40c5-bc84-b1455265886a\") " pod="openstack/dnsmasq-dns-698758b865-mf77b" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.450697 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-4vkk2" podStartSLOduration=2.450685548 podStartE2EDuration="2.450685548s" podCreationTimestamp="2025-10-01 10:32:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:32:12.425408189 +0000 UTC m=+891.118229471" watchObservedRunningTime="2025-10-01 10:32:12.450685548 +0000 UTC m=+891.143506800" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.451178 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9f055a8-fec6-40c5-bc84-b1455265886a-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-mf77b\" (UID: 
\"f9f055a8-fec6-40c5-bc84-b1455265886a\") " pod="openstack/dnsmasq-dns-698758b865-mf77b" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.451882 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9f055a8-fec6-40c5-bc84-b1455265886a-dns-svc\") pod \"dnsmasq-dns-698758b865-mf77b\" (UID: \"f9f055a8-fec6-40c5-bc84-b1455265886a\") " pod="openstack/dnsmasq-dns-698758b865-mf77b" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.467092 4735 scope.go:117] "RemoveContainer" containerID="45088c109f8f888c8ea2b0475af4cbfc60c27c684bf231e144f83aac7184b13c" Oct 01 10:32:12 crc kubenswrapper[4735]: E1001 10:32:12.467771 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45088c109f8f888c8ea2b0475af4cbfc60c27c684bf231e144f83aac7184b13c\": container with ID starting with 45088c109f8f888c8ea2b0475af4cbfc60c27c684bf231e144f83aac7184b13c not found: ID does not exist" containerID="45088c109f8f888c8ea2b0475af4cbfc60c27c684bf231e144f83aac7184b13c" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.467807 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45088c109f8f888c8ea2b0475af4cbfc60c27c684bf231e144f83aac7184b13c"} err="failed to get container status \"45088c109f8f888c8ea2b0475af4cbfc60c27c684bf231e144f83aac7184b13c\": rpc error: code = NotFound desc = could not find container \"45088c109f8f888c8ea2b0475af4cbfc60c27c684bf231e144f83aac7184b13c\": container with ID starting with 45088c109f8f888c8ea2b0475af4cbfc60c27c684bf231e144f83aac7184b13c not found: ID does not exist" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.467830 4735 scope.go:117] "RemoveContainer" containerID="391f026922b5798f120920cc50557d9011a61bfc747c256e08ca0b5d6cac8b31" Oct 01 10:32:12 crc kubenswrapper[4735]: E1001 10:32:12.470466 4735 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"391f026922b5798f120920cc50557d9011a61bfc747c256e08ca0b5d6cac8b31\": container with ID starting with 391f026922b5798f120920cc50557d9011a61bfc747c256e08ca0b5d6cac8b31 not found: ID does not exist" containerID="391f026922b5798f120920cc50557d9011a61bfc747c256e08ca0b5d6cac8b31" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.470530 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"391f026922b5798f120920cc50557d9011a61bfc747c256e08ca0b5d6cac8b31"} err="failed to get container status \"391f026922b5798f120920cc50557d9011a61bfc747c256e08ca0b5d6cac8b31\": rpc error: code = NotFound desc = could not find container \"391f026922b5798f120920cc50557d9011a61bfc747c256e08ca0b5d6cac8b31\": container with ID starting with 391f026922b5798f120920cc50557d9011a61bfc747c256e08ca0b5d6cac8b31 not found: ID does not exist" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.471760 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94rlx\" (UniqueName: \"kubernetes.io/projected/f9f055a8-fec6-40c5-bc84-b1455265886a-kube-api-access-94rlx\") pod \"dnsmasq-dns-698758b865-mf77b\" (UID: \"f9f055a8-fec6-40c5-bc84-b1455265886a\") " pod="openstack/dnsmasq-dns-698758b865-mf77b" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.474455 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gdsfg"] Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.478118 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gdsfg"] Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.488388 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-mf77b" Oct 01 10:32:12 crc kubenswrapper[4735]: E1001 10:32:12.597164 4735 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Oct 01 10:32:12 crc kubenswrapper[4735]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/ed5ea827-1e94-4de0-ba76-958a200fb185/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 01 10:32:12 crc kubenswrapper[4735]: > podSandboxID="d94df195d73de0766dcaa45b046599fca0247d7fc4f59535151b4b3a8122f7fa" Oct 01 10:32:12 crc kubenswrapper[4735]: E1001 10:32:12.597330 4735 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 01 10:32:12 crc kubenswrapper[4735]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n599h5cbh7ch5d4h66fh676hdbh546h95h88h5ffh55ch7fhch57ch687hddhc7h5fdh57dh674h56fh64ch98h9bh557h55dh646h54ch54fh5c4h597q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4lxvm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-86db49b7ff-wn8jw_openstack(ed5ea827-1e94-4de0-ba76-958a200fb185): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/ed5ea827-1e94-4de0-ba76-958a200fb185/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 01 10:32:12 crc kubenswrapper[4735]: > logger="UnhandledError" Oct 01 10:32:12 crc kubenswrapper[4735]: E1001 10:32:12.598567 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/ed5ea827-1e94-4de0-ba76-958a200fb185/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-86db49b7ff-wn8jw" podUID="ed5ea827-1e94-4de0-ba76-958a200fb185" Oct 01 10:32:12 crc kubenswrapper[4735]: I1001 10:32:12.914061 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-mf77b"] Oct 01 10:32:12 crc kubenswrapper[4735]: W1001 10:32:12.929403 4735 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9f055a8_fec6_40c5_bc84_b1455265886a.slice/crio-31931eef4ab0d0b727c15bddbdfee3c3bfe9a29e3deec7c902948549bff22a54 WatchSource:0}: Error finding container 31931eef4ab0d0b727c15bddbdfee3c3bfe9a29e3deec7c902948549bff22a54: Status 404 returned error can't find the container with id 31931eef4ab0d0b727c15bddbdfee3c3bfe9a29e3deec7c902948549bff22a54 Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.206658 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.213857 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.215351 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-pg2tw" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.215735 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.215895 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.216405 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.224197 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.363058 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/651abe6c-1b2e-4652-a985-74f6cf2c7e17-etc-swift\") pod \"swift-storage-0\" (UID: \"651abe6c-1b2e-4652-a985-74f6cf2c7e17\") " pod="openstack/swift-storage-0" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.363125 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/651abe6c-1b2e-4652-a985-74f6cf2c7e17-lock\") pod \"swift-storage-0\" (UID: \"651abe6c-1b2e-4652-a985-74f6cf2c7e17\") " pod="openstack/swift-storage-0" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.363151 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"651abe6c-1b2e-4652-a985-74f6cf2c7e17\") " pod="openstack/swift-storage-0" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.363326 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/651abe6c-1b2e-4652-a985-74f6cf2c7e17-cache\") pod \"swift-storage-0\" (UID: \"651abe6c-1b2e-4652-a985-74f6cf2c7e17\") " pod="openstack/swift-storage-0" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.363445 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbrrz\" (UniqueName: \"kubernetes.io/projected/651abe6c-1b2e-4652-a985-74f6cf2c7e17-kube-api-access-zbrrz\") pod \"swift-storage-0\" (UID: \"651abe6c-1b2e-4652-a985-74f6cf2c7e17\") " pod="openstack/swift-storage-0" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.415465 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-vwv82" event={"ID":"dd20020a-5a21-4558-91de-23f3b8d6278f","Type":"ContainerStarted","Data":"9f9132b258244fb5d4797e9e1c54234945f77014990612b65e0d89dd9450e4d7"} Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.415610 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f896c8c65-vwv82" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.415621 4735 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f896c8c65-vwv82" podUID="dd20020a-5a21-4558-91de-23f3b8d6278f" containerName="dnsmasq-dns" containerID="cri-o://9f9132b258244fb5d4797e9e1c54234945f77014990612b65e0d89dd9450e4d7" gracePeriod=10 Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.423919 4735 generic.go:334] "Generic (PLEG): container finished" podID="f9f055a8-fec6-40c5-bc84-b1455265886a" containerID="866068884d9e51446ca7596b461d363f32ee8e78d6e77b0247ee3bf1a32d52ee" exitCode=0 Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.423987 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-mf77b" event={"ID":"f9f055a8-fec6-40c5-bc84-b1455265886a","Type":"ContainerDied","Data":"866068884d9e51446ca7596b461d363f32ee8e78d6e77b0247ee3bf1a32d52ee"} Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.424016 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-mf77b" event={"ID":"f9f055a8-fec6-40c5-bc84-b1455265886a","Type":"ContainerStarted","Data":"31931eef4ab0d0b727c15bddbdfee3c3bfe9a29e3deec7c902948549bff22a54"} Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.426730 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"356e4644-7f54-4b34-b72a-510958be19e5","Type":"ContainerStarted","Data":"979a49f0a3c9a0abfb83f358d09184ebff21850386e18ff83a20aa30176f6a88"} Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.438746 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f896c8c65-vwv82" podStartSLOduration=3.438719841 podStartE2EDuration="3.438719841s" podCreationTimestamp="2025-10-01 10:32:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:32:13.430338352 +0000 UTC m=+892.123159614" watchObservedRunningTime="2025-10-01 
10:32:13.438719841 +0000 UTC m=+892.131541103" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.465584 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/651abe6c-1b2e-4652-a985-74f6cf2c7e17-etc-swift\") pod \"swift-storage-0\" (UID: \"651abe6c-1b2e-4652-a985-74f6cf2c7e17\") " pod="openstack/swift-storage-0" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.465650 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/651abe6c-1b2e-4652-a985-74f6cf2c7e17-lock\") pod \"swift-storage-0\" (UID: \"651abe6c-1b2e-4652-a985-74f6cf2c7e17\") " pod="openstack/swift-storage-0" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.465671 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"651abe6c-1b2e-4652-a985-74f6cf2c7e17\") " pod="openstack/swift-storage-0" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.465697 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/651abe6c-1b2e-4652-a985-74f6cf2c7e17-cache\") pod \"swift-storage-0\" (UID: \"651abe6c-1b2e-4652-a985-74f6cf2c7e17\") " pod="openstack/swift-storage-0" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.465729 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbrrz\" (UniqueName: \"kubernetes.io/projected/651abe6c-1b2e-4652-a985-74f6cf2c7e17-kube-api-access-zbrrz\") pod \"swift-storage-0\" (UID: \"651abe6c-1b2e-4652-a985-74f6cf2c7e17\") " pod="openstack/swift-storage-0" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.466272 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"651abe6c-1b2e-4652-a985-74f6cf2c7e17\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/swift-storage-0" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.466442 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/651abe6c-1b2e-4652-a985-74f6cf2c7e17-lock\") pod \"swift-storage-0\" (UID: \"651abe6c-1b2e-4652-a985-74f6cf2c7e17\") " pod="openstack/swift-storage-0" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.466462 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/651abe6c-1b2e-4652-a985-74f6cf2c7e17-cache\") pod \"swift-storage-0\" (UID: \"651abe6c-1b2e-4652-a985-74f6cf2c7e17\") " pod="openstack/swift-storage-0" Oct 01 10:32:13 crc kubenswrapper[4735]: E1001 10:32:13.466812 4735 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 01 10:32:13 crc kubenswrapper[4735]: E1001 10:32:13.466880 4735 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 01 10:32:13 crc kubenswrapper[4735]: E1001 10:32:13.471653 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/651abe6c-1b2e-4652-a985-74f6cf2c7e17-etc-swift podName:651abe6c-1b2e-4652-a985-74f6cf2c7e17 nodeName:}" failed. No retries permitted until 2025-10-01 10:32:13.971629529 +0000 UTC m=+892.664450791 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/651abe6c-1b2e-4652-a985-74f6cf2c7e17-etc-swift") pod "swift-storage-0" (UID: "651abe6c-1b2e-4652-a985-74f6cf2c7e17") : configmap "swift-ring-files" not found Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.491916 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbrrz\" (UniqueName: \"kubernetes.io/projected/651abe6c-1b2e-4652-a985-74f6cf2c7e17-kube-api-access-zbrrz\") pod \"swift-storage-0\" (UID: \"651abe6c-1b2e-4652-a985-74f6cf2c7e17\") " pod="openstack/swift-storage-0" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.542847 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"651abe6c-1b2e-4652-a985-74f6cf2c7e17\") " pod="openstack/swift-storage-0" Oct 01 10:32:13 crc kubenswrapper[4735]: E1001 10:32:13.554302 4735 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd20020a_5a21_4558_91de_23f3b8d6278f.slice/crio-conmon-9f9132b258244fb5d4797e9e1c54234945f77014990612b65e0d89dd9450e4d7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd20020a_5a21_4558_91de_23f3b8d6278f.slice/crio-9f9132b258244fb5d4797e9e1c54234945f77014990612b65e0d89dd9450e4d7.scope\": RecentStats: unable to find data in memory cache]" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.706875 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-clfk4"] Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.708735 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-clfk4" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.714321 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.714436 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.714450 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.740019 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-clfk4"] Oct 01 10:32:13 crc kubenswrapper[4735]: E1001 10:32:13.741467 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-z7mlp ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-clfk4" podUID="60de9ada-2ec6-4478-ac16-233568de0ba4" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.748550 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-8gpx2"] Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.749865 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-8gpx2" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.753126 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-clfk4"] Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.760706 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-8gpx2"] Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.776624 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7mlp\" (UniqueName: \"kubernetes.io/projected/60de9ada-2ec6-4478-ac16-233568de0ba4-kube-api-access-z7mlp\") pod \"swift-ring-rebalance-clfk4\" (UID: \"60de9ada-2ec6-4478-ac16-233568de0ba4\") " pod="openstack/swift-ring-rebalance-clfk4" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.777296 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60de9ada-2ec6-4478-ac16-233568de0ba4-combined-ca-bundle\") pod \"swift-ring-rebalance-clfk4\" (UID: \"60de9ada-2ec6-4478-ac16-233568de0ba4\") " pod="openstack/swift-ring-rebalance-clfk4" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.780473 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/60de9ada-2ec6-4478-ac16-233568de0ba4-dispersionconf\") pod \"swift-ring-rebalance-clfk4\" (UID: \"60de9ada-2ec6-4478-ac16-233568de0ba4\") " pod="openstack/swift-ring-rebalance-clfk4" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.783396 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/60de9ada-2ec6-4478-ac16-233568de0ba4-ring-data-devices\") pod \"swift-ring-rebalance-clfk4\" (UID: \"60de9ada-2ec6-4478-ac16-233568de0ba4\") " 
pod="openstack/swift-ring-rebalance-clfk4" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.783478 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60de9ada-2ec6-4478-ac16-233568de0ba4-scripts\") pod \"swift-ring-rebalance-clfk4\" (UID: \"60de9ada-2ec6-4478-ac16-233568de0ba4\") " pod="openstack/swift-ring-rebalance-clfk4" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.783971 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/60de9ada-2ec6-4478-ac16-233568de0ba4-etc-swift\") pod \"swift-ring-rebalance-clfk4\" (UID: \"60de9ada-2ec6-4478-ac16-233568de0ba4\") " pod="openstack/swift-ring-rebalance-clfk4" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.784047 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/60de9ada-2ec6-4478-ac16-233568de0ba4-swiftconf\") pod \"swift-ring-rebalance-clfk4\" (UID: \"60de9ada-2ec6-4478-ac16-233568de0ba4\") " pod="openstack/swift-ring-rebalance-clfk4" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.879590 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-vwv82" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.885283 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/60de9ada-2ec6-4478-ac16-233568de0ba4-ring-data-devices\") pod \"swift-ring-rebalance-clfk4\" (UID: \"60de9ada-2ec6-4478-ac16-233568de0ba4\") " pod="openstack/swift-ring-rebalance-clfk4" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.885333 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b5dd8db9-b427-4510-b6ab-82883a128fa2-dispersionconf\") pod \"swift-ring-rebalance-8gpx2\" (UID: \"b5dd8db9-b427-4510-b6ab-82883a128fa2\") " pod="openstack/swift-ring-rebalance-8gpx2" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.885363 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60de9ada-2ec6-4478-ac16-233568de0ba4-scripts\") pod \"swift-ring-rebalance-clfk4\" (UID: \"60de9ada-2ec6-4478-ac16-233568de0ba4\") " pod="openstack/swift-ring-rebalance-clfk4" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.885418 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5dd8db9-b427-4510-b6ab-82883a128fa2-combined-ca-bundle\") pod \"swift-ring-rebalance-8gpx2\" (UID: \"b5dd8db9-b427-4510-b6ab-82883a128fa2\") " pod="openstack/swift-ring-rebalance-8gpx2" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.885457 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b5dd8db9-b427-4510-b6ab-82883a128fa2-swiftconf\") pod \"swift-ring-rebalance-8gpx2\" (UID: \"b5dd8db9-b427-4510-b6ab-82883a128fa2\") " 
pod="openstack/swift-ring-rebalance-8gpx2" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.885529 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b5dd8db9-b427-4510-b6ab-82883a128fa2-ring-data-devices\") pod \"swift-ring-rebalance-8gpx2\" (UID: \"b5dd8db9-b427-4510-b6ab-82883a128fa2\") " pod="openstack/swift-ring-rebalance-8gpx2" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.885555 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5dd8db9-b427-4510-b6ab-82883a128fa2-scripts\") pod \"swift-ring-rebalance-8gpx2\" (UID: \"b5dd8db9-b427-4510-b6ab-82883a128fa2\") " pod="openstack/swift-ring-rebalance-8gpx2" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.885731 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/60de9ada-2ec6-4478-ac16-233568de0ba4-etc-swift\") pod \"swift-ring-rebalance-clfk4\" (UID: \"60de9ada-2ec6-4478-ac16-233568de0ba4\") " pod="openstack/swift-ring-rebalance-clfk4" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.885762 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/60de9ada-2ec6-4478-ac16-233568de0ba4-swiftconf\") pod \"swift-ring-rebalance-clfk4\" (UID: \"60de9ada-2ec6-4478-ac16-233568de0ba4\") " pod="openstack/swift-ring-rebalance-clfk4" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.885803 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7mlp\" (UniqueName: \"kubernetes.io/projected/60de9ada-2ec6-4478-ac16-233568de0ba4-kube-api-access-z7mlp\") pod \"swift-ring-rebalance-clfk4\" (UID: \"60de9ada-2ec6-4478-ac16-233568de0ba4\") " pod="openstack/swift-ring-rebalance-clfk4" Oct 01 
10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.885885 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60de9ada-2ec6-4478-ac16-233568de0ba4-combined-ca-bundle\") pod \"swift-ring-rebalance-clfk4\" (UID: \"60de9ada-2ec6-4478-ac16-233568de0ba4\") " pod="openstack/swift-ring-rebalance-clfk4" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.886324 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b5dd8db9-b427-4510-b6ab-82883a128fa2-etc-swift\") pod \"swift-ring-rebalance-8gpx2\" (UID: \"b5dd8db9-b427-4510-b6ab-82883a128fa2\") " pod="openstack/swift-ring-rebalance-8gpx2" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.886354 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/60de9ada-2ec6-4478-ac16-233568de0ba4-dispersionconf\") pod \"swift-ring-rebalance-clfk4\" (UID: \"60de9ada-2ec6-4478-ac16-233568de0ba4\") " pod="openstack/swift-ring-rebalance-clfk4" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.886374 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwdqz\" (UniqueName: \"kubernetes.io/projected/b5dd8db9-b427-4510-b6ab-82883a128fa2-kube-api-access-zwdqz\") pod \"swift-ring-rebalance-8gpx2\" (UID: \"b5dd8db9-b427-4510-b6ab-82883a128fa2\") " pod="openstack/swift-ring-rebalance-8gpx2" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.886745 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60de9ada-2ec6-4478-ac16-233568de0ba4-scripts\") pod \"swift-ring-rebalance-clfk4\" (UID: \"60de9ada-2ec6-4478-ac16-233568de0ba4\") " pod="openstack/swift-ring-rebalance-clfk4" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 
10:32:13.886885 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/60de9ada-2ec6-4478-ac16-233568de0ba4-ring-data-devices\") pod \"swift-ring-rebalance-clfk4\" (UID: \"60de9ada-2ec6-4478-ac16-233568de0ba4\") " pod="openstack/swift-ring-rebalance-clfk4" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.887146 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/60de9ada-2ec6-4478-ac16-233568de0ba4-etc-swift\") pod \"swift-ring-rebalance-clfk4\" (UID: \"60de9ada-2ec6-4478-ac16-233568de0ba4\") " pod="openstack/swift-ring-rebalance-clfk4" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.891633 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60de9ada-2ec6-4478-ac16-233568de0ba4-combined-ca-bundle\") pod \"swift-ring-rebalance-clfk4\" (UID: \"60de9ada-2ec6-4478-ac16-233568de0ba4\") " pod="openstack/swift-ring-rebalance-clfk4" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.895059 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/60de9ada-2ec6-4478-ac16-233568de0ba4-swiftconf\") pod \"swift-ring-rebalance-clfk4\" (UID: \"60de9ada-2ec6-4478-ac16-233568de0ba4\") " pod="openstack/swift-ring-rebalance-clfk4" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.899922 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/60de9ada-2ec6-4478-ac16-233568de0ba4-dispersionconf\") pod \"swift-ring-rebalance-clfk4\" (UID: \"60de9ada-2ec6-4478-ac16-233568de0ba4\") " pod="openstack/swift-ring-rebalance-clfk4" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.911866 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7mlp\" (UniqueName: 
\"kubernetes.io/projected/60de9ada-2ec6-4478-ac16-233568de0ba4-kube-api-access-z7mlp\") pod \"swift-ring-rebalance-clfk4\" (UID: \"60de9ada-2ec6-4478-ac16-233568de0ba4\") " pod="openstack/swift-ring-rebalance-clfk4" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.938359 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc112b48-aa35-4a2a-a496-ac720211a123" path="/var/lib/kubelet/pods/dc112b48-aa35-4a2a-a496-ac720211a123/volumes" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.987725 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd20020a-5a21-4558-91de-23f3b8d6278f-dns-svc\") pod \"dd20020a-5a21-4558-91de-23f3b8d6278f\" (UID: \"dd20020a-5a21-4558-91de-23f3b8d6278f\") " Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.987795 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd20020a-5a21-4558-91de-23f3b8d6278f-ovsdbserver-sb\") pod \"dd20020a-5a21-4558-91de-23f3b8d6278f\" (UID: \"dd20020a-5a21-4558-91de-23f3b8d6278f\") " Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.987838 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd20020a-5a21-4558-91de-23f3b8d6278f-config\") pod \"dd20020a-5a21-4558-91de-23f3b8d6278f\" (UID: \"dd20020a-5a21-4558-91de-23f3b8d6278f\") " Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.987919 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5r5z\" (UniqueName: \"kubernetes.io/projected/dd20020a-5a21-4558-91de-23f3b8d6278f-kube-api-access-d5r5z\") pod \"dd20020a-5a21-4558-91de-23f3b8d6278f\" (UID: \"dd20020a-5a21-4558-91de-23f3b8d6278f\") " Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.988075 4735 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b5dd8db9-b427-4510-b6ab-82883a128fa2-ring-data-devices\") pod \"swift-ring-rebalance-8gpx2\" (UID: \"b5dd8db9-b427-4510-b6ab-82883a128fa2\") " pod="openstack/swift-ring-rebalance-8gpx2" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.988099 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5dd8db9-b427-4510-b6ab-82883a128fa2-scripts\") pod \"swift-ring-rebalance-8gpx2\" (UID: \"b5dd8db9-b427-4510-b6ab-82883a128fa2\") " pod="openstack/swift-ring-rebalance-8gpx2" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.988150 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/651abe6c-1b2e-4652-a985-74f6cf2c7e17-etc-swift\") pod \"swift-storage-0\" (UID: \"651abe6c-1b2e-4652-a985-74f6cf2c7e17\") " pod="openstack/swift-storage-0" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.988188 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b5dd8db9-b427-4510-b6ab-82883a128fa2-etc-swift\") pod \"swift-ring-rebalance-8gpx2\" (UID: \"b5dd8db9-b427-4510-b6ab-82883a128fa2\") " pod="openstack/swift-ring-rebalance-8gpx2" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.988206 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwdqz\" (UniqueName: \"kubernetes.io/projected/b5dd8db9-b427-4510-b6ab-82883a128fa2-kube-api-access-zwdqz\") pod \"swift-ring-rebalance-8gpx2\" (UID: \"b5dd8db9-b427-4510-b6ab-82883a128fa2\") " pod="openstack/swift-ring-rebalance-8gpx2" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.988232 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/b5dd8db9-b427-4510-b6ab-82883a128fa2-dispersionconf\") pod \"swift-ring-rebalance-8gpx2\" (UID: \"b5dd8db9-b427-4510-b6ab-82883a128fa2\") " pod="openstack/swift-ring-rebalance-8gpx2" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.988278 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5dd8db9-b427-4510-b6ab-82883a128fa2-combined-ca-bundle\") pod \"swift-ring-rebalance-8gpx2\" (UID: \"b5dd8db9-b427-4510-b6ab-82883a128fa2\") " pod="openstack/swift-ring-rebalance-8gpx2" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.988300 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b5dd8db9-b427-4510-b6ab-82883a128fa2-swiftconf\") pod \"swift-ring-rebalance-8gpx2\" (UID: \"b5dd8db9-b427-4510-b6ab-82883a128fa2\") " pod="openstack/swift-ring-rebalance-8gpx2" Oct 01 10:32:13 crc kubenswrapper[4735]: E1001 10:32:13.989723 4735 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 01 10:32:13 crc kubenswrapper[4735]: E1001 10:32:13.989758 4735 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 01 10:32:13 crc kubenswrapper[4735]: E1001 10:32:13.989820 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/651abe6c-1b2e-4652-a985-74f6cf2c7e17-etc-swift podName:651abe6c-1b2e-4652-a985-74f6cf2c7e17 nodeName:}" failed. No retries permitted until 2025-10-01 10:32:14.989800009 +0000 UTC m=+893.682621271 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/651abe6c-1b2e-4652-a985-74f6cf2c7e17-etc-swift") pod "swift-storage-0" (UID: "651abe6c-1b2e-4652-a985-74f6cf2c7e17") : configmap "swift-ring-files" not found Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.992833 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b5dd8db9-b427-4510-b6ab-82883a128fa2-ring-data-devices\") pod \"swift-ring-rebalance-8gpx2\" (UID: \"b5dd8db9-b427-4510-b6ab-82883a128fa2\") " pod="openstack/swift-ring-rebalance-8gpx2" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.993766 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5dd8db9-b427-4510-b6ab-82883a128fa2-scripts\") pod \"swift-ring-rebalance-8gpx2\" (UID: \"b5dd8db9-b427-4510-b6ab-82883a128fa2\") " pod="openstack/swift-ring-rebalance-8gpx2" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.996432 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b5dd8db9-b427-4510-b6ab-82883a128fa2-etc-swift\") pod \"swift-ring-rebalance-8gpx2\" (UID: \"b5dd8db9-b427-4510-b6ab-82883a128fa2\") " pod="openstack/swift-ring-rebalance-8gpx2" Oct 01 10:32:13 crc kubenswrapper[4735]: I1001 10:32:13.998122 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5dd8db9-b427-4510-b6ab-82883a128fa2-combined-ca-bundle\") pod \"swift-ring-rebalance-8gpx2\" (UID: \"b5dd8db9-b427-4510-b6ab-82883a128fa2\") " pod="openstack/swift-ring-rebalance-8gpx2" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.002051 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd20020a-5a21-4558-91de-23f3b8d6278f-kube-api-access-d5r5z" (OuterVolumeSpecName: 
"kube-api-access-d5r5z") pod "dd20020a-5a21-4558-91de-23f3b8d6278f" (UID: "dd20020a-5a21-4558-91de-23f3b8d6278f"). InnerVolumeSpecName "kube-api-access-d5r5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.009475 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b5dd8db9-b427-4510-b6ab-82883a128fa2-dispersionconf\") pod \"swift-ring-rebalance-8gpx2\" (UID: \"b5dd8db9-b427-4510-b6ab-82883a128fa2\") " pod="openstack/swift-ring-rebalance-8gpx2" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.012879 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b5dd8db9-b427-4510-b6ab-82883a128fa2-swiftconf\") pod \"swift-ring-rebalance-8gpx2\" (UID: \"b5dd8db9-b427-4510-b6ab-82883a128fa2\") " pod="openstack/swift-ring-rebalance-8gpx2" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.029405 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwdqz\" (UniqueName: \"kubernetes.io/projected/b5dd8db9-b427-4510-b6ab-82883a128fa2-kube-api-access-zwdqz\") pod \"swift-ring-rebalance-8gpx2\" (UID: \"b5dd8db9-b427-4510-b6ab-82883a128fa2\") " pod="openstack/swift-ring-rebalance-8gpx2" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.051303 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd20020a-5a21-4558-91de-23f3b8d6278f-config" (OuterVolumeSpecName: "config") pod "dd20020a-5a21-4558-91de-23f3b8d6278f" (UID: "dd20020a-5a21-4558-91de-23f3b8d6278f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.055682 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd20020a-5a21-4558-91de-23f3b8d6278f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dd20020a-5a21-4558-91de-23f3b8d6278f" (UID: "dd20020a-5a21-4558-91de-23f3b8d6278f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.056174 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd20020a-5a21-4558-91de-23f3b8d6278f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dd20020a-5a21-4558-91de-23f3b8d6278f" (UID: "dd20020a-5a21-4558-91de-23f3b8d6278f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.072185 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-8gpx2" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.090220 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5r5z\" (UniqueName: \"kubernetes.io/projected/dd20020a-5a21-4558-91de-23f3b8d6278f-kube-api-access-d5r5z\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.090253 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd20020a-5a21-4558-91de-23f3b8d6278f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.090264 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd20020a-5a21-4558-91de-23f3b8d6278f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.090271 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd20020a-5a21-4558-91de-23f3b8d6278f-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.435256 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"356e4644-7f54-4b34-b72a-510958be19e5","Type":"ContainerStarted","Data":"51b5866e57242c2a31b313f9ee070a1edf1dbead6dc1810279df4caa9341d65f"} Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.435574 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"356e4644-7f54-4b34-b72a-510958be19e5","Type":"ContainerStarted","Data":"5f7fecf717236288399ff12ce003d14d3f909eb6918e8245b0e79c4a8b0a58ce"} Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.435588 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.438904 4735 generic.go:334] "Generic (PLEG): container 
finished" podID="dd20020a-5a21-4558-91de-23f3b8d6278f" containerID="9f9132b258244fb5d4797e9e1c54234945f77014990612b65e0d89dd9450e4d7" exitCode=0 Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.438931 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-vwv82" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.438996 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-vwv82" event={"ID":"dd20020a-5a21-4558-91de-23f3b8d6278f","Type":"ContainerDied","Data":"9f9132b258244fb5d4797e9e1c54234945f77014990612b65e0d89dd9450e4d7"} Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.439034 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-vwv82" event={"ID":"dd20020a-5a21-4558-91de-23f3b8d6278f","Type":"ContainerDied","Data":"a1530b0dd8be7e28d40ae4c5c1f65d57fbb0d80fd177817625fc574e15b233a5"} Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.439055 4735 scope.go:117] "RemoveContainer" containerID="9f9132b258244fb5d4797e9e1c54234945f77014990612b65e0d89dd9450e4d7" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.443085 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-wn8jw" event={"ID":"ed5ea827-1e94-4de0-ba76-958a200fb185","Type":"ContainerStarted","Data":"4263a049dc0db4c56b43cde5d2ae9e531c8b37b864d7c5b9864aceb58d9f3dc4"} Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.443892 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-wn8jw" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.446378 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-clfk4" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.446424 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-mf77b" event={"ID":"f9f055a8-fec6-40c5-bc84-b1455265886a","Type":"ContainerStarted","Data":"8b6f27691cdc4fedafcdfe885dda34d3cf783172d1ee8c703ef17fe95bcdf0dc"} Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.446968 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-mf77b" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.460929 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-clfk4" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.464917 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.302307247 podStartE2EDuration="3.464896413s" podCreationTimestamp="2025-10-01 10:32:11 +0000 UTC" firstStartedPulling="2025-10-01 10:32:12.435180075 +0000 UTC m=+891.128001337" lastFinishedPulling="2025-10-01 10:32:13.597769241 +0000 UTC m=+892.290590503" observedRunningTime="2025-10-01 10:32:14.45519886 +0000 UTC m=+893.148020132" watchObservedRunningTime="2025-10-01 10:32:14.464896413 +0000 UTC m=+893.157717685" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.473659 4735 scope.go:117] "RemoveContainer" containerID="f3331e7b7d737a344c4e88af67d5b2da3e2d3928a3917aaf97aaae9d1b56858a" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.479665 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-wn8jw" podStartSLOduration=4.479647716 podStartE2EDuration="4.479647716s" podCreationTimestamp="2025-10-01 10:32:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:32:14.474884017 +0000 UTC 
m=+893.167705279" watchObservedRunningTime="2025-10-01 10:32:14.479647716 +0000 UTC m=+893.172468998" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.497751 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-mf77b" podStartSLOduration=2.4977350400000002 podStartE2EDuration="2.49773504s" podCreationTimestamp="2025-10-01 10:32:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:32:14.495864069 +0000 UTC m=+893.188685341" watchObservedRunningTime="2025-10-01 10:32:14.49773504 +0000 UTC m=+893.190556312" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.512146 4735 scope.go:117] "RemoveContainer" containerID="9f9132b258244fb5d4797e9e1c54234945f77014990612b65e0d89dd9450e4d7" Oct 01 10:32:14 crc kubenswrapper[4735]: E1001 10:32:14.515354 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f9132b258244fb5d4797e9e1c54234945f77014990612b65e0d89dd9450e4d7\": container with ID starting with 9f9132b258244fb5d4797e9e1c54234945f77014990612b65e0d89dd9450e4d7 not found: ID does not exist" containerID="9f9132b258244fb5d4797e9e1c54234945f77014990612b65e0d89dd9450e4d7" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.515414 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f9132b258244fb5d4797e9e1c54234945f77014990612b65e0d89dd9450e4d7"} err="failed to get container status \"9f9132b258244fb5d4797e9e1c54234945f77014990612b65e0d89dd9450e4d7\": rpc error: code = NotFound desc = could not find container \"9f9132b258244fb5d4797e9e1c54234945f77014990612b65e0d89dd9450e4d7\": container with ID starting with 9f9132b258244fb5d4797e9e1c54234945f77014990612b65e0d89dd9450e4d7 not found: ID does not exist" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.515446 4735 scope.go:117] 
"RemoveContainer" containerID="f3331e7b7d737a344c4e88af67d5b2da3e2d3928a3917aaf97aaae9d1b56858a" Oct 01 10:32:14 crc kubenswrapper[4735]: E1001 10:32:14.516265 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3331e7b7d737a344c4e88af67d5b2da3e2d3928a3917aaf97aaae9d1b56858a\": container with ID starting with f3331e7b7d737a344c4e88af67d5b2da3e2d3928a3917aaf97aaae9d1b56858a not found: ID does not exist" containerID="f3331e7b7d737a344c4e88af67d5b2da3e2d3928a3917aaf97aaae9d1b56858a" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.516298 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3331e7b7d737a344c4e88af67d5b2da3e2d3928a3917aaf97aaae9d1b56858a"} err="failed to get container status \"f3331e7b7d737a344c4e88af67d5b2da3e2d3928a3917aaf97aaae9d1b56858a\": rpc error: code = NotFound desc = could not find container \"f3331e7b7d737a344c4e88af67d5b2da3e2d3928a3917aaf97aaae9d1b56858a\": container with ID starting with f3331e7b7d737a344c4e88af67d5b2da3e2d3928a3917aaf97aaae9d1b56858a not found: ID does not exist" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.518295 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-8gpx2"] Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.528752 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-vwv82"] Oct 01 10:32:14 crc kubenswrapper[4735]: W1001 10:32:14.528905 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5dd8db9_b427_4510_b6ab_82883a128fa2.slice/crio-94bb02eafd8c96392c5590c484cdd80d0852022aee3f7de1f9a1f39851aab02d WatchSource:0}: Error finding container 94bb02eafd8c96392c5590c484cdd80d0852022aee3f7de1f9a1f39851aab02d: Status 404 returned error can't find the container with id 
94bb02eafd8c96392c5590c484cdd80d0852022aee3f7de1f9a1f39851aab02d Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.534132 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-vwv82"] Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.615348 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7mlp\" (UniqueName: \"kubernetes.io/projected/60de9ada-2ec6-4478-ac16-233568de0ba4-kube-api-access-z7mlp\") pod \"60de9ada-2ec6-4478-ac16-233568de0ba4\" (UID: \"60de9ada-2ec6-4478-ac16-233568de0ba4\") " Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.615406 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60de9ada-2ec6-4478-ac16-233568de0ba4-combined-ca-bundle\") pod \"60de9ada-2ec6-4478-ac16-233568de0ba4\" (UID: \"60de9ada-2ec6-4478-ac16-233568de0ba4\") " Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.615543 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/60de9ada-2ec6-4478-ac16-233568de0ba4-swiftconf\") pod \"60de9ada-2ec6-4478-ac16-233568de0ba4\" (UID: \"60de9ada-2ec6-4478-ac16-233568de0ba4\") " Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.615581 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/60de9ada-2ec6-4478-ac16-233568de0ba4-etc-swift\") pod \"60de9ada-2ec6-4478-ac16-233568de0ba4\" (UID: \"60de9ada-2ec6-4478-ac16-233568de0ba4\") " Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.615628 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/60de9ada-2ec6-4478-ac16-233568de0ba4-ring-data-devices\") pod \"60de9ada-2ec6-4478-ac16-233568de0ba4\" (UID: \"60de9ada-2ec6-4478-ac16-233568de0ba4\") 
" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.615659 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60de9ada-2ec6-4478-ac16-233568de0ba4-scripts\") pod \"60de9ada-2ec6-4478-ac16-233568de0ba4\" (UID: \"60de9ada-2ec6-4478-ac16-233568de0ba4\") " Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.615704 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/60de9ada-2ec6-4478-ac16-233568de0ba4-dispersionconf\") pod \"60de9ada-2ec6-4478-ac16-233568de0ba4\" (UID: \"60de9ada-2ec6-4478-ac16-233568de0ba4\") " Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.615994 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60de9ada-2ec6-4478-ac16-233568de0ba4-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "60de9ada-2ec6-4478-ac16-233568de0ba4" (UID: "60de9ada-2ec6-4478-ac16-233568de0ba4"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.616054 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60de9ada-2ec6-4478-ac16-233568de0ba4-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "60de9ada-2ec6-4478-ac16-233568de0ba4" (UID: "60de9ada-2ec6-4478-ac16-233568de0ba4"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.616609 4735 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/60de9ada-2ec6-4478-ac16-233568de0ba4-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.616638 4735 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/60de9ada-2ec6-4478-ac16-233568de0ba4-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.617128 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60de9ada-2ec6-4478-ac16-233568de0ba4-scripts" (OuterVolumeSpecName: "scripts") pod "60de9ada-2ec6-4478-ac16-233568de0ba4" (UID: "60de9ada-2ec6-4478-ac16-233568de0ba4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.620476 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60de9ada-2ec6-4478-ac16-233568de0ba4-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "60de9ada-2ec6-4478-ac16-233568de0ba4" (UID: "60de9ada-2ec6-4478-ac16-233568de0ba4"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.620586 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60de9ada-2ec6-4478-ac16-233568de0ba4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60de9ada-2ec6-4478-ac16-233568de0ba4" (UID: "60de9ada-2ec6-4478-ac16-233568de0ba4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.620697 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60de9ada-2ec6-4478-ac16-233568de0ba4-kube-api-access-z7mlp" (OuterVolumeSpecName: "kube-api-access-z7mlp") pod "60de9ada-2ec6-4478-ac16-233568de0ba4" (UID: "60de9ada-2ec6-4478-ac16-233568de0ba4"). InnerVolumeSpecName "kube-api-access-z7mlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.621097 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60de9ada-2ec6-4478-ac16-233568de0ba4-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "60de9ada-2ec6-4478-ac16-233568de0ba4" (UID: "60de9ada-2ec6-4478-ac16-233568de0ba4"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.718200 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60de9ada-2ec6-4478-ac16-233568de0ba4-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.718219 4735 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/60de9ada-2ec6-4478-ac16-233568de0ba4-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.718232 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7mlp\" (UniqueName: \"kubernetes.io/projected/60de9ada-2ec6-4478-ac16-233568de0ba4-kube-api-access-z7mlp\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.718240 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60de9ada-2ec6-4478-ac16-233568de0ba4-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Oct 01 10:32:14 crc kubenswrapper[4735]: I1001 10:32:14.718249 4735 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/60de9ada-2ec6-4478-ac16-233568de0ba4-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:15 crc kubenswrapper[4735]: I1001 10:32:15.022777 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/651abe6c-1b2e-4652-a985-74f6cf2c7e17-etc-swift\") pod \"swift-storage-0\" (UID: \"651abe6c-1b2e-4652-a985-74f6cf2c7e17\") " pod="openstack/swift-storage-0" Oct 01 10:32:15 crc kubenswrapper[4735]: E1001 10:32:15.022932 4735 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 01 10:32:15 crc kubenswrapper[4735]: E1001 10:32:15.022949 4735 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 01 10:32:15 crc kubenswrapper[4735]: E1001 10:32:15.023000 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/651abe6c-1b2e-4652-a985-74f6cf2c7e17-etc-swift podName:651abe6c-1b2e-4652-a985-74f6cf2c7e17 nodeName:}" failed. No retries permitted until 2025-10-01 10:32:17.022985334 +0000 UTC m=+895.715806596 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/651abe6c-1b2e-4652-a985-74f6cf2c7e17-etc-swift") pod "swift-storage-0" (UID: "651abe6c-1b2e-4652-a985-74f6cf2c7e17") : configmap "swift-ring-files" not found Oct 01 10:32:15 crc kubenswrapper[4735]: I1001 10:32:15.461109 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-8gpx2" event={"ID":"b5dd8db9-b427-4510-b6ab-82883a128fa2","Type":"ContainerStarted","Data":"94bb02eafd8c96392c5590c484cdd80d0852022aee3f7de1f9a1f39851aab02d"} Oct 01 10:32:15 crc kubenswrapper[4735]: I1001 10:32:15.464893 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-clfk4" Oct 01 10:32:15 crc kubenswrapper[4735]: I1001 10:32:15.520527 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-clfk4"] Oct 01 10:32:15 crc kubenswrapper[4735]: I1001 10:32:15.528585 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-clfk4"] Oct 01 10:32:15 crc kubenswrapper[4735]: I1001 10:32:15.908938 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60de9ada-2ec6-4478-ac16-233568de0ba4" path="/var/lib/kubelet/pods/60de9ada-2ec6-4478-ac16-233568de0ba4/volumes" Oct 01 10:32:15 crc kubenswrapper[4735]: I1001 10:32:15.909328 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd20020a-5a21-4558-91de-23f3b8d6278f" path="/var/lib/kubelet/pods/dd20020a-5a21-4558-91de-23f3b8d6278f/volumes" Oct 01 10:32:17 crc kubenswrapper[4735]: I1001 10:32:17.076558 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/651abe6c-1b2e-4652-a985-74f6cf2c7e17-etc-swift\") pod \"swift-storage-0\" (UID: \"651abe6c-1b2e-4652-a985-74f6cf2c7e17\") " pod="openstack/swift-storage-0" Oct 01 10:32:17 crc kubenswrapper[4735]: E1001 10:32:17.077676 4735 
projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 01 10:32:17 crc kubenswrapper[4735]: E1001 10:32:17.077704 4735 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 01 10:32:17 crc kubenswrapper[4735]: E1001 10:32:17.077762 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/651abe6c-1b2e-4652-a985-74f6cf2c7e17-etc-swift podName:651abe6c-1b2e-4652-a985-74f6cf2c7e17 nodeName:}" failed. No retries permitted until 2025-10-01 10:32:21.077743346 +0000 UTC m=+899.770564608 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/651abe6c-1b2e-4652-a985-74f6cf2c7e17-etc-swift") pod "swift-storage-0" (UID: "651abe6c-1b2e-4652-a985-74f6cf2c7e17") : configmap "swift-ring-files" not found Oct 01 10:32:18 crc kubenswrapper[4735]: I1001 10:32:18.487920 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-8gpx2" event={"ID":"b5dd8db9-b427-4510-b6ab-82883a128fa2","Type":"ContainerStarted","Data":"a714c5af5c5d32549b2f304a099a2838c3e96aefc607f1b5d64ec98ba0e140f8"} Oct 01 10:32:18 crc kubenswrapper[4735]: I1001 10:32:18.513781 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-8gpx2" podStartSLOduration=1.80226599 podStartE2EDuration="5.513758903s" podCreationTimestamp="2025-10-01 10:32:13 +0000 UTC" firstStartedPulling="2025-10-01 10:32:14.53143322 +0000 UTC m=+893.224254482" lastFinishedPulling="2025-10-01 10:32:18.242926143 +0000 UTC m=+896.935747395" observedRunningTime="2025-10-01 10:32:18.506544536 +0000 UTC m=+897.199365828" watchObservedRunningTime="2025-10-01 10:32:18.513758903 +0000 UTC m=+897.206580175" Oct 01 10:32:19 crc kubenswrapper[4735]: I1001 10:32:19.337357 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/openstack-galera-0" Oct 01 10:32:19 crc kubenswrapper[4735]: I1001 10:32:19.337423 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 01 10:32:19 crc kubenswrapper[4735]: I1001 10:32:19.399403 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 01 10:32:19 crc kubenswrapper[4735]: I1001 10:32:19.553993 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 01 10:32:19 crc kubenswrapper[4735]: I1001 10:32:19.758130 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 01 10:32:19 crc kubenswrapper[4735]: I1001 10:32:19.758442 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 01 10:32:19 crc kubenswrapper[4735]: I1001 10:32:19.797887 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 01 10:32:20 crc kubenswrapper[4735]: I1001 10:32:20.092551 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-4cn4h"] Oct 01 10:32:20 crc kubenswrapper[4735]: E1001 10:32:20.092861 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd20020a-5a21-4558-91de-23f3b8d6278f" containerName="init" Oct 01 10:32:20 crc kubenswrapper[4735]: I1001 10:32:20.092879 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd20020a-5a21-4558-91de-23f3b8d6278f" containerName="init" Oct 01 10:32:20 crc kubenswrapper[4735]: E1001 10:32:20.092895 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd20020a-5a21-4558-91de-23f3b8d6278f" containerName="dnsmasq-dns" Oct 01 10:32:20 crc kubenswrapper[4735]: I1001 10:32:20.092900 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd20020a-5a21-4558-91de-23f3b8d6278f" 
containerName="dnsmasq-dns" Oct 01 10:32:20 crc kubenswrapper[4735]: I1001 10:32:20.093078 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd20020a-5a21-4558-91de-23f3b8d6278f" containerName="dnsmasq-dns" Oct 01 10:32:20 crc kubenswrapper[4735]: I1001 10:32:20.093562 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-4cn4h" Oct 01 10:32:20 crc kubenswrapper[4735]: I1001 10:32:20.099907 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-4cn4h"] Oct 01 10:32:20 crc kubenswrapper[4735]: I1001 10:32:20.228083 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcttt\" (UniqueName: \"kubernetes.io/projected/b26f86c5-90f8-43cb-b728-bcdc6eaac262-kube-api-access-mcttt\") pod \"placement-db-create-4cn4h\" (UID: \"b26f86c5-90f8-43cb-b728-bcdc6eaac262\") " pod="openstack/placement-db-create-4cn4h" Oct 01 10:32:20 crc kubenswrapper[4735]: I1001 10:32:20.297433 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-sdsqt"] Oct 01 10:32:20 crc kubenswrapper[4735]: I1001 10:32:20.298387 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-sdsqt" Oct 01 10:32:20 crc kubenswrapper[4735]: I1001 10:32:20.306646 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-sdsqt"] Oct 01 10:32:20 crc kubenswrapper[4735]: I1001 10:32:20.329849 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcttt\" (UniqueName: \"kubernetes.io/projected/b26f86c5-90f8-43cb-b728-bcdc6eaac262-kube-api-access-mcttt\") pod \"placement-db-create-4cn4h\" (UID: \"b26f86c5-90f8-43cb-b728-bcdc6eaac262\") " pod="openstack/placement-db-create-4cn4h" Oct 01 10:32:20 crc kubenswrapper[4735]: I1001 10:32:20.360809 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcttt\" (UniqueName: \"kubernetes.io/projected/b26f86c5-90f8-43cb-b728-bcdc6eaac262-kube-api-access-mcttt\") pod \"placement-db-create-4cn4h\" (UID: \"b26f86c5-90f8-43cb-b728-bcdc6eaac262\") " pod="openstack/placement-db-create-4cn4h" Oct 01 10:32:20 crc kubenswrapper[4735]: I1001 10:32:20.409054 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-4cn4h" Oct 01 10:32:20 crc kubenswrapper[4735]: I1001 10:32:20.431287 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch59l\" (UniqueName: \"kubernetes.io/projected/17b59340-1ad8-48a8-8521-c4b72ea14720-kube-api-access-ch59l\") pod \"glance-db-create-sdsqt\" (UID: \"17b59340-1ad8-48a8-8521-c4b72ea14720\") " pod="openstack/glance-db-create-sdsqt" Oct 01 10:32:20 crc kubenswrapper[4735]: I1001 10:32:20.532372 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch59l\" (UniqueName: \"kubernetes.io/projected/17b59340-1ad8-48a8-8521-c4b72ea14720-kube-api-access-ch59l\") pod \"glance-db-create-sdsqt\" (UID: \"17b59340-1ad8-48a8-8521-c4b72ea14720\") " pod="openstack/glance-db-create-sdsqt" Oct 01 10:32:20 crc kubenswrapper[4735]: I1001 10:32:20.549455 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch59l\" (UniqueName: \"kubernetes.io/projected/17b59340-1ad8-48a8-8521-c4b72ea14720-kube-api-access-ch59l\") pod \"glance-db-create-sdsqt\" (UID: \"17b59340-1ad8-48a8-8521-c4b72ea14720\") " pod="openstack/glance-db-create-sdsqt" Oct 01 10:32:20 crc kubenswrapper[4735]: I1001 10:32:20.580545 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 01 10:32:20 crc kubenswrapper[4735]: I1001 10:32:20.615462 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-sdsqt" Oct 01 10:32:20 crc kubenswrapper[4735]: I1001 10:32:20.825505 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-4cn4h"] Oct 01 10:32:20 crc kubenswrapper[4735]: W1001 10:32:20.829132 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb26f86c5_90f8_43cb_b728_bcdc6eaac262.slice/crio-af9b3e07c51a3cd50916eb5b604f148dd91c4632302bc7153b1134c9fa59bc7b WatchSource:0}: Error finding container af9b3e07c51a3cd50916eb5b604f148dd91c4632302bc7153b1134c9fa59bc7b: Status 404 returned error can't find the container with id af9b3e07c51a3cd50916eb5b604f148dd91c4632302bc7153b1134c9fa59bc7b Oct 01 10:32:21 crc kubenswrapper[4735]: I1001 10:32:21.040345 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-sdsqt"] Oct 01 10:32:21 crc kubenswrapper[4735]: W1001 10:32:21.061555 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17b59340_1ad8_48a8_8521_c4b72ea14720.slice/crio-6f413861139b4a393a8af740f5134dfbda65b89a4b3abec651d9f1528fa47787 WatchSource:0}: Error finding container 6f413861139b4a393a8af740f5134dfbda65b89a4b3abec651d9f1528fa47787: Status 404 returned error can't find the container with id 6f413861139b4a393a8af740f5134dfbda65b89a4b3abec651d9f1528fa47787 Oct 01 10:32:21 crc kubenswrapper[4735]: I1001 10:32:21.142841 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/651abe6c-1b2e-4652-a985-74f6cf2c7e17-etc-swift\") pod \"swift-storage-0\" (UID: \"651abe6c-1b2e-4652-a985-74f6cf2c7e17\") " pod="openstack/swift-storage-0" Oct 01 10:32:21 crc kubenswrapper[4735]: E1001 10:32:21.143057 4735 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 
01 10:32:21 crc kubenswrapper[4735]: E1001 10:32:21.143078 4735 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 01 10:32:21 crc kubenswrapper[4735]: E1001 10:32:21.143141 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/651abe6c-1b2e-4652-a985-74f6cf2c7e17-etc-swift podName:651abe6c-1b2e-4652-a985-74f6cf2c7e17 nodeName:}" failed. No retries permitted until 2025-10-01 10:32:29.143120296 +0000 UTC m=+907.835941558 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/651abe6c-1b2e-4652-a985-74f6cf2c7e17-etc-swift") pod "swift-storage-0" (UID: "651abe6c-1b2e-4652-a985-74f6cf2c7e17") : configmap "swift-ring-files" not found Oct 01 10:32:21 crc kubenswrapper[4735]: I1001 10:32:21.360638 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-wn8jw" Oct 01 10:32:21 crc kubenswrapper[4735]: I1001 10:32:21.527063 4735 generic.go:334] "Generic (PLEG): container finished" podID="b26f86c5-90f8-43cb-b728-bcdc6eaac262" containerID="3e8fb92337e9a80bca506949f8d810eb0d38642e2f367e9f28c432281bbbdba3" exitCode=0 Oct 01 10:32:21 crc kubenswrapper[4735]: I1001 10:32:21.527157 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-4cn4h" event={"ID":"b26f86c5-90f8-43cb-b728-bcdc6eaac262","Type":"ContainerDied","Data":"3e8fb92337e9a80bca506949f8d810eb0d38642e2f367e9f28c432281bbbdba3"} Oct 01 10:32:21 crc kubenswrapper[4735]: I1001 10:32:21.527457 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-4cn4h" event={"ID":"b26f86c5-90f8-43cb-b728-bcdc6eaac262","Type":"ContainerStarted","Data":"af9b3e07c51a3cd50916eb5b604f148dd91c4632302bc7153b1134c9fa59bc7b"} Oct 01 10:32:21 crc kubenswrapper[4735]: I1001 10:32:21.530644 4735 generic.go:334] "Generic (PLEG): 
container finished" podID="17b59340-1ad8-48a8-8521-c4b72ea14720" containerID="b2c96490c70328115f0b86561f65ea61606b977bcc4ec9286815be1d297dcc5c" exitCode=0 Oct 01 10:32:21 crc kubenswrapper[4735]: I1001 10:32:21.530786 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-sdsqt" event={"ID":"17b59340-1ad8-48a8-8521-c4b72ea14720","Type":"ContainerDied","Data":"b2c96490c70328115f0b86561f65ea61606b977bcc4ec9286815be1d297dcc5c"} Oct 01 10:32:21 crc kubenswrapper[4735]: I1001 10:32:21.530844 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-sdsqt" event={"ID":"17b59340-1ad8-48a8-8521-c4b72ea14720","Type":"ContainerStarted","Data":"6f413861139b4a393a8af740f5134dfbda65b89a4b3abec651d9f1528fa47787"} Oct 01 10:32:22 crc kubenswrapper[4735]: I1001 10:32:22.490679 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-mf77b" Oct 01 10:32:22 crc kubenswrapper[4735]: I1001 10:32:22.554409 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-wn8jw"] Oct 01 10:32:22 crc kubenswrapper[4735]: I1001 10:32:22.554722 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-wn8jw" podUID="ed5ea827-1e94-4de0-ba76-958a200fb185" containerName="dnsmasq-dns" containerID="cri-o://4263a049dc0db4c56b43cde5d2ae9e531c8b37b864d7c5b9864aceb58d9f3dc4" gracePeriod=10 Oct 01 10:32:22 crc kubenswrapper[4735]: I1001 10:32:22.926855 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-sdsqt" Oct 01 10:32:22 crc kubenswrapper[4735]: I1001 10:32:22.933832 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-4cn4h" Oct 01 10:32:22 crc kubenswrapper[4735]: I1001 10:32:22.988529 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ch59l\" (UniqueName: \"kubernetes.io/projected/17b59340-1ad8-48a8-8521-c4b72ea14720-kube-api-access-ch59l\") pod \"17b59340-1ad8-48a8-8521-c4b72ea14720\" (UID: \"17b59340-1ad8-48a8-8521-c4b72ea14720\") " Oct 01 10:32:22 crc kubenswrapper[4735]: I1001 10:32:22.993964 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17b59340-1ad8-48a8-8521-c4b72ea14720-kube-api-access-ch59l" (OuterVolumeSpecName: "kube-api-access-ch59l") pod "17b59340-1ad8-48a8-8521-c4b72ea14720" (UID: "17b59340-1ad8-48a8-8521-c4b72ea14720"). InnerVolumeSpecName "kube-api-access-ch59l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:32:23 crc kubenswrapper[4735]: I1001 10:32:23.090459 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcttt\" (UniqueName: \"kubernetes.io/projected/b26f86c5-90f8-43cb-b728-bcdc6eaac262-kube-api-access-mcttt\") pod \"b26f86c5-90f8-43cb-b728-bcdc6eaac262\" (UID: \"b26f86c5-90f8-43cb-b728-bcdc6eaac262\") " Oct 01 10:32:23 crc kubenswrapper[4735]: I1001 10:32:23.091182 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ch59l\" (UniqueName: \"kubernetes.io/projected/17b59340-1ad8-48a8-8521-c4b72ea14720-kube-api-access-ch59l\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:23 crc kubenswrapper[4735]: I1001 10:32:23.094197 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b26f86c5-90f8-43cb-b728-bcdc6eaac262-kube-api-access-mcttt" (OuterVolumeSpecName: "kube-api-access-mcttt") pod "b26f86c5-90f8-43cb-b728-bcdc6eaac262" (UID: "b26f86c5-90f8-43cb-b728-bcdc6eaac262"). InnerVolumeSpecName "kube-api-access-mcttt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:32:23 crc kubenswrapper[4735]: I1001 10:32:23.193103 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcttt\" (UniqueName: \"kubernetes.io/projected/b26f86c5-90f8-43cb-b728-bcdc6eaac262-kube-api-access-mcttt\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:23 crc kubenswrapper[4735]: I1001 10:32:23.555539 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-4cn4h" event={"ID":"b26f86c5-90f8-43cb-b728-bcdc6eaac262","Type":"ContainerDied","Data":"af9b3e07c51a3cd50916eb5b604f148dd91c4632302bc7153b1134c9fa59bc7b"} Oct 01 10:32:23 crc kubenswrapper[4735]: I1001 10:32:23.555582 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af9b3e07c51a3cd50916eb5b604f148dd91c4632302bc7153b1134c9fa59bc7b" Oct 01 10:32:23 crc kubenswrapper[4735]: I1001 10:32:23.555557 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-4cn4h" Oct 01 10:32:23 crc kubenswrapper[4735]: I1001 10:32:23.556765 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-sdsqt" event={"ID":"17b59340-1ad8-48a8-8521-c4b72ea14720","Type":"ContainerDied","Data":"6f413861139b4a393a8af740f5134dfbda65b89a4b3abec651d9f1528fa47787"} Oct 01 10:32:23 crc kubenswrapper[4735]: I1001 10:32:23.556793 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f413861139b4a393a8af740f5134dfbda65b89a4b3abec651d9f1528fa47787" Oct 01 10:32:23 crc kubenswrapper[4735]: I1001 10:32:23.556834 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-sdsqt" Oct 01 10:32:23 crc kubenswrapper[4735]: I1001 10:32:23.559370 4735 generic.go:334] "Generic (PLEG): container finished" podID="ed5ea827-1e94-4de0-ba76-958a200fb185" containerID="4263a049dc0db4c56b43cde5d2ae9e531c8b37b864d7c5b9864aceb58d9f3dc4" exitCode=0 Oct 01 10:32:23 crc kubenswrapper[4735]: I1001 10:32:23.559422 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-wn8jw" event={"ID":"ed5ea827-1e94-4de0-ba76-958a200fb185","Type":"ContainerDied","Data":"4263a049dc0db4c56b43cde5d2ae9e531c8b37b864d7c5b9864aceb58d9f3dc4"} Oct 01 10:32:23 crc kubenswrapper[4735]: I1001 10:32:23.750878 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-wn8jw" Oct 01 10:32:23 crc kubenswrapper[4735]: I1001 10:32:23.910388 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed5ea827-1e94-4de0-ba76-958a200fb185-ovsdbserver-sb\") pod \"ed5ea827-1e94-4de0-ba76-958a200fb185\" (UID: \"ed5ea827-1e94-4de0-ba76-958a200fb185\") " Oct 01 10:32:23 crc kubenswrapper[4735]: I1001 10:32:23.911054 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed5ea827-1e94-4de0-ba76-958a200fb185-ovsdbserver-nb\") pod \"ed5ea827-1e94-4de0-ba76-958a200fb185\" (UID: \"ed5ea827-1e94-4de0-ba76-958a200fb185\") " Oct 01 10:32:23 crc kubenswrapper[4735]: I1001 10:32:23.911185 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed5ea827-1e94-4de0-ba76-958a200fb185-dns-svc\") pod \"ed5ea827-1e94-4de0-ba76-958a200fb185\" (UID: \"ed5ea827-1e94-4de0-ba76-958a200fb185\") " Oct 01 10:32:23 crc kubenswrapper[4735]: I1001 10:32:23.911275 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-4lxvm\" (UniqueName: \"kubernetes.io/projected/ed5ea827-1e94-4de0-ba76-958a200fb185-kube-api-access-4lxvm\") pod \"ed5ea827-1e94-4de0-ba76-958a200fb185\" (UID: \"ed5ea827-1e94-4de0-ba76-958a200fb185\") " Oct 01 10:32:23 crc kubenswrapper[4735]: I1001 10:32:23.911379 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed5ea827-1e94-4de0-ba76-958a200fb185-config\") pod \"ed5ea827-1e94-4de0-ba76-958a200fb185\" (UID: \"ed5ea827-1e94-4de0-ba76-958a200fb185\") " Oct 01 10:32:23 crc kubenswrapper[4735]: I1001 10:32:23.917028 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed5ea827-1e94-4de0-ba76-958a200fb185-kube-api-access-4lxvm" (OuterVolumeSpecName: "kube-api-access-4lxvm") pod "ed5ea827-1e94-4de0-ba76-958a200fb185" (UID: "ed5ea827-1e94-4de0-ba76-958a200fb185"). InnerVolumeSpecName "kube-api-access-4lxvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:32:23 crc kubenswrapper[4735]: I1001 10:32:23.959040 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed5ea827-1e94-4de0-ba76-958a200fb185-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ed5ea827-1e94-4de0-ba76-958a200fb185" (UID: "ed5ea827-1e94-4de0-ba76-958a200fb185"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:32:23 crc kubenswrapper[4735]: I1001 10:32:23.965879 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed5ea827-1e94-4de0-ba76-958a200fb185-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ed5ea827-1e94-4de0-ba76-958a200fb185" (UID: "ed5ea827-1e94-4de0-ba76-958a200fb185"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:32:23 crc kubenswrapper[4735]: I1001 10:32:23.975365 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed5ea827-1e94-4de0-ba76-958a200fb185-config" (OuterVolumeSpecName: "config") pod "ed5ea827-1e94-4de0-ba76-958a200fb185" (UID: "ed5ea827-1e94-4de0-ba76-958a200fb185"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:32:23 crc kubenswrapper[4735]: I1001 10:32:23.998775 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed5ea827-1e94-4de0-ba76-958a200fb185-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ed5ea827-1e94-4de0-ba76-958a200fb185" (UID: "ed5ea827-1e94-4de0-ba76-958a200fb185"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:32:24 crc kubenswrapper[4735]: I1001 10:32:24.014146 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed5ea827-1e94-4de0-ba76-958a200fb185-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:24 crc kubenswrapper[4735]: I1001 10:32:24.014184 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed5ea827-1e94-4de0-ba76-958a200fb185-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:24 crc kubenswrapper[4735]: I1001 10:32:24.014197 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lxvm\" (UniqueName: \"kubernetes.io/projected/ed5ea827-1e94-4de0-ba76-958a200fb185-kube-api-access-4lxvm\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:24 crc kubenswrapper[4735]: I1001 10:32:24.014213 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed5ea827-1e94-4de0-ba76-958a200fb185-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:24 crc 
kubenswrapper[4735]: I1001 10:32:24.014224 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed5ea827-1e94-4de0-ba76-958a200fb185-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:24 crc kubenswrapper[4735]: I1001 10:32:24.567697 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-wn8jw" event={"ID":"ed5ea827-1e94-4de0-ba76-958a200fb185","Type":"ContainerDied","Data":"d94df195d73de0766dcaa45b046599fca0247d7fc4f59535151b4b3a8122f7fa"} Oct 01 10:32:24 crc kubenswrapper[4735]: I1001 10:32:24.567753 4735 scope.go:117] "RemoveContainer" containerID="4263a049dc0db4c56b43cde5d2ae9e531c8b37b864d7c5b9864aceb58d9f3dc4" Oct 01 10:32:24 crc kubenswrapper[4735]: I1001 10:32:24.567768 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-wn8jw" Oct 01 10:32:24 crc kubenswrapper[4735]: I1001 10:32:24.595714 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-wn8jw"] Oct 01 10:32:24 crc kubenswrapper[4735]: I1001 10:32:24.597205 4735 scope.go:117] "RemoveContainer" containerID="bfb064342e3579a661318b2308b767f8f14fa83c4dba82c5fb13dabb5d813280" Oct 01 10:32:24 crc kubenswrapper[4735]: I1001 10:32:24.600846 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-wn8jw"] Oct 01 10:32:25 crc kubenswrapper[4735]: I1001 10:32:25.578218 4735 generic.go:334] "Generic (PLEG): container finished" podID="b5dd8db9-b427-4510-b6ab-82883a128fa2" containerID="a714c5af5c5d32549b2f304a099a2838c3e96aefc607f1b5d64ec98ba0e140f8" exitCode=0 Oct 01 10:32:25 crc kubenswrapper[4735]: I1001 10:32:25.578328 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-8gpx2" 
event={"ID":"b5dd8db9-b427-4510-b6ab-82883a128fa2","Type":"ContainerDied","Data":"a714c5af5c5d32549b2f304a099a2838c3e96aefc607f1b5d64ec98ba0e140f8"} Oct 01 10:32:25 crc kubenswrapper[4735]: I1001 10:32:25.909604 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed5ea827-1e94-4de0-ba76-958a200fb185" path="/var/lib/kubelet/pods/ed5ea827-1e94-4de0-ba76-958a200fb185/volumes" Oct 01 10:32:26 crc kubenswrapper[4735]: I1001 10:32:26.912457 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-8gpx2" Oct 01 10:32:26 crc kubenswrapper[4735]: I1001 10:32:26.972180 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 01 10:32:27 crc kubenswrapper[4735]: I1001 10:32:27.061835 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwdqz\" (UniqueName: \"kubernetes.io/projected/b5dd8db9-b427-4510-b6ab-82883a128fa2-kube-api-access-zwdqz\") pod \"b5dd8db9-b427-4510-b6ab-82883a128fa2\" (UID: \"b5dd8db9-b427-4510-b6ab-82883a128fa2\") " Oct 01 10:32:27 crc kubenswrapper[4735]: I1001 10:32:27.061919 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5dd8db9-b427-4510-b6ab-82883a128fa2-scripts\") pod \"b5dd8db9-b427-4510-b6ab-82883a128fa2\" (UID: \"b5dd8db9-b427-4510-b6ab-82883a128fa2\") " Oct 01 10:32:27 crc kubenswrapper[4735]: I1001 10:32:27.061952 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5dd8db9-b427-4510-b6ab-82883a128fa2-combined-ca-bundle\") pod \"b5dd8db9-b427-4510-b6ab-82883a128fa2\" (UID: \"b5dd8db9-b427-4510-b6ab-82883a128fa2\") " Oct 01 10:32:27 crc kubenswrapper[4735]: I1001 10:32:27.061979 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/b5dd8db9-b427-4510-b6ab-82883a128fa2-etc-swift\") pod \"b5dd8db9-b427-4510-b6ab-82883a128fa2\" (UID: \"b5dd8db9-b427-4510-b6ab-82883a128fa2\") " Oct 01 10:32:27 crc kubenswrapper[4735]: I1001 10:32:27.062045 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b5dd8db9-b427-4510-b6ab-82883a128fa2-swiftconf\") pod \"b5dd8db9-b427-4510-b6ab-82883a128fa2\" (UID: \"b5dd8db9-b427-4510-b6ab-82883a128fa2\") " Oct 01 10:32:27 crc kubenswrapper[4735]: I1001 10:32:27.062075 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b5dd8db9-b427-4510-b6ab-82883a128fa2-ring-data-devices\") pod \"b5dd8db9-b427-4510-b6ab-82883a128fa2\" (UID: \"b5dd8db9-b427-4510-b6ab-82883a128fa2\") " Oct 01 10:32:27 crc kubenswrapper[4735]: I1001 10:32:27.062115 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b5dd8db9-b427-4510-b6ab-82883a128fa2-dispersionconf\") pod \"b5dd8db9-b427-4510-b6ab-82883a128fa2\" (UID: \"b5dd8db9-b427-4510-b6ab-82883a128fa2\") " Oct 01 10:32:27 crc kubenswrapper[4735]: I1001 10:32:27.063421 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5dd8db9-b427-4510-b6ab-82883a128fa2-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b5dd8db9-b427-4510-b6ab-82883a128fa2" (UID: "b5dd8db9-b427-4510-b6ab-82883a128fa2"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:32:27 crc kubenswrapper[4735]: I1001 10:32:27.064204 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5dd8db9-b427-4510-b6ab-82883a128fa2-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b5dd8db9-b427-4510-b6ab-82883a128fa2" (UID: "b5dd8db9-b427-4510-b6ab-82883a128fa2"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:32:27 crc kubenswrapper[4735]: I1001 10:32:27.068225 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5dd8db9-b427-4510-b6ab-82883a128fa2-kube-api-access-zwdqz" (OuterVolumeSpecName: "kube-api-access-zwdqz") pod "b5dd8db9-b427-4510-b6ab-82883a128fa2" (UID: "b5dd8db9-b427-4510-b6ab-82883a128fa2"). InnerVolumeSpecName "kube-api-access-zwdqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:32:27 crc kubenswrapper[4735]: I1001 10:32:27.071165 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5dd8db9-b427-4510-b6ab-82883a128fa2-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b5dd8db9-b427-4510-b6ab-82883a128fa2" (UID: "b5dd8db9-b427-4510-b6ab-82883a128fa2"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:32:27 crc kubenswrapper[4735]: I1001 10:32:27.084035 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5dd8db9-b427-4510-b6ab-82883a128fa2-scripts" (OuterVolumeSpecName: "scripts") pod "b5dd8db9-b427-4510-b6ab-82883a128fa2" (UID: "b5dd8db9-b427-4510-b6ab-82883a128fa2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:32:27 crc kubenswrapper[4735]: I1001 10:32:27.091055 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5dd8db9-b427-4510-b6ab-82883a128fa2-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b5dd8db9-b427-4510-b6ab-82883a128fa2" (UID: "b5dd8db9-b427-4510-b6ab-82883a128fa2"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:32:27 crc kubenswrapper[4735]: I1001 10:32:27.101227 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5dd8db9-b427-4510-b6ab-82883a128fa2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5dd8db9-b427-4510-b6ab-82883a128fa2" (UID: "b5dd8db9-b427-4510-b6ab-82883a128fa2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:32:27 crc kubenswrapper[4735]: I1001 10:32:27.165218 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwdqz\" (UniqueName: \"kubernetes.io/projected/b5dd8db9-b427-4510-b6ab-82883a128fa2-kube-api-access-zwdqz\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:27 crc kubenswrapper[4735]: I1001 10:32:27.165561 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5dd8db9-b427-4510-b6ab-82883a128fa2-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:27 crc kubenswrapper[4735]: I1001 10:32:27.165665 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5dd8db9-b427-4510-b6ab-82883a128fa2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:27 crc kubenswrapper[4735]: I1001 10:32:27.165744 4735 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b5dd8db9-b427-4510-b6ab-82883a128fa2-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 01 
10:32:27 crc kubenswrapper[4735]: I1001 10:32:27.165860 4735 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b5dd8db9-b427-4510-b6ab-82883a128fa2-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:27 crc kubenswrapper[4735]: I1001 10:32:27.165972 4735 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b5dd8db9-b427-4510-b6ab-82883a128fa2-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:27 crc kubenswrapper[4735]: I1001 10:32:27.166097 4735 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b5dd8db9-b427-4510-b6ab-82883a128fa2-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:27 crc kubenswrapper[4735]: I1001 10:32:27.593893 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-8gpx2" event={"ID":"b5dd8db9-b427-4510-b6ab-82883a128fa2","Type":"ContainerDied","Data":"94bb02eafd8c96392c5590c484cdd80d0852022aee3f7de1f9a1f39851aab02d"} Oct 01 10:32:27 crc kubenswrapper[4735]: I1001 10:32:27.594260 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94bb02eafd8c96392c5590c484cdd80d0852022aee3f7de1f9a1f39851aab02d" Oct 01 10:32:27 crc kubenswrapper[4735]: I1001 10:32:27.594016 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-8gpx2" Oct 01 10:32:29 crc kubenswrapper[4735]: I1001 10:32:29.201910 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/651abe6c-1b2e-4652-a985-74f6cf2c7e17-etc-swift\") pod \"swift-storage-0\" (UID: \"651abe6c-1b2e-4652-a985-74f6cf2c7e17\") " pod="openstack/swift-storage-0" Oct 01 10:32:29 crc kubenswrapper[4735]: I1001 10:32:29.209129 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/651abe6c-1b2e-4652-a985-74f6cf2c7e17-etc-swift\") pod \"swift-storage-0\" (UID: \"651abe6c-1b2e-4652-a985-74f6cf2c7e17\") " pod="openstack/swift-storage-0" Oct 01 10:32:29 crc kubenswrapper[4735]: I1001 10:32:29.433709 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 01 10:32:29 crc kubenswrapper[4735]: I1001 10:32:29.612944 4735 generic.go:334] "Generic (PLEG): container finished" podID="0be1b363-c0e5-4c73-9359-00032a6c8ab9" containerID="dd7a7fe9aeac4b441d9f64f6405e80c705fceb5363e7a65d51c92409606fc6d2" exitCode=0 Oct 01 10:32:29 crc kubenswrapper[4735]: I1001 10:32:29.613011 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0be1b363-c0e5-4c73-9359-00032a6c8ab9","Type":"ContainerDied","Data":"dd7a7fe9aeac4b441d9f64f6405e80c705fceb5363e7a65d51c92409606fc6d2"} Oct 01 10:32:29 crc kubenswrapper[4735]: I1001 10:32:29.645196 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-rx5s2"] Oct 01 10:32:29 crc kubenswrapper[4735]: E1001 10:32:29.645946 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed5ea827-1e94-4de0-ba76-958a200fb185" containerName="dnsmasq-dns" Oct 01 10:32:29 crc kubenswrapper[4735]: I1001 10:32:29.645967 4735 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ed5ea827-1e94-4de0-ba76-958a200fb185" containerName="dnsmasq-dns" Oct 01 10:32:29 crc kubenswrapper[4735]: E1001 10:32:29.645984 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5dd8db9-b427-4510-b6ab-82883a128fa2" containerName="swift-ring-rebalance" Oct 01 10:32:29 crc kubenswrapper[4735]: I1001 10:32:29.645992 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5dd8db9-b427-4510-b6ab-82883a128fa2" containerName="swift-ring-rebalance" Oct 01 10:32:29 crc kubenswrapper[4735]: E1001 10:32:29.646007 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed5ea827-1e94-4de0-ba76-958a200fb185" containerName="init" Oct 01 10:32:29 crc kubenswrapper[4735]: I1001 10:32:29.646016 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed5ea827-1e94-4de0-ba76-958a200fb185" containerName="init" Oct 01 10:32:29 crc kubenswrapper[4735]: E1001 10:32:29.646030 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b26f86c5-90f8-43cb-b728-bcdc6eaac262" containerName="mariadb-database-create" Oct 01 10:32:29 crc kubenswrapper[4735]: I1001 10:32:29.646038 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b26f86c5-90f8-43cb-b728-bcdc6eaac262" containerName="mariadb-database-create" Oct 01 10:32:29 crc kubenswrapper[4735]: E1001 10:32:29.646055 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17b59340-1ad8-48a8-8521-c4b72ea14720" containerName="mariadb-database-create" Oct 01 10:32:29 crc kubenswrapper[4735]: I1001 10:32:29.646062 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="17b59340-1ad8-48a8-8521-c4b72ea14720" containerName="mariadb-database-create" Oct 01 10:32:29 crc kubenswrapper[4735]: I1001 10:32:29.646239 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5dd8db9-b427-4510-b6ab-82883a128fa2" containerName="swift-ring-rebalance" Oct 01 10:32:29 crc kubenswrapper[4735]: I1001 10:32:29.646291 4735 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="b26f86c5-90f8-43cb-b728-bcdc6eaac262" containerName="mariadb-database-create" Oct 01 10:32:29 crc kubenswrapper[4735]: I1001 10:32:29.646316 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed5ea827-1e94-4de0-ba76-958a200fb185" containerName="dnsmasq-dns" Oct 01 10:32:29 crc kubenswrapper[4735]: I1001 10:32:29.646327 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="17b59340-1ad8-48a8-8521-c4b72ea14720" containerName="mariadb-database-create" Oct 01 10:32:29 crc kubenswrapper[4735]: I1001 10:32:29.647004 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-rx5s2" Oct 01 10:32:29 crc kubenswrapper[4735]: I1001 10:32:29.669211 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-rx5s2"] Oct 01 10:32:29 crc kubenswrapper[4735]: I1001 10:32:29.814675 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjkqf\" (UniqueName: \"kubernetes.io/projected/d6bf79da-946c-4805-9ba3-0b58e969b33a-kube-api-access-hjkqf\") pod \"keystone-db-create-rx5s2\" (UID: \"d6bf79da-946c-4805-9ba3-0b58e969b33a\") " pod="openstack/keystone-db-create-rx5s2" Oct 01 10:32:29 crc kubenswrapper[4735]: I1001 10:32:29.916796 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjkqf\" (UniqueName: \"kubernetes.io/projected/d6bf79da-946c-4805-9ba3-0b58e969b33a-kube-api-access-hjkqf\") pod \"keystone-db-create-rx5s2\" (UID: \"d6bf79da-946c-4805-9ba3-0b58e969b33a\") " pod="openstack/keystone-db-create-rx5s2" Oct 01 10:32:29 crc kubenswrapper[4735]: I1001 10:32:29.940367 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 01 10:32:29 crc kubenswrapper[4735]: I1001 10:32:29.947965 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjkqf\" (UniqueName: 
\"kubernetes.io/projected/d6bf79da-946c-4805-9ba3-0b58e969b33a-kube-api-access-hjkqf\") pod \"keystone-db-create-rx5s2\" (UID: \"d6bf79da-946c-4805-9ba3-0b58e969b33a\") " pod="openstack/keystone-db-create-rx5s2" Oct 01 10:32:30 crc kubenswrapper[4735]: I1001 10:32:30.062214 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-rx5s2" Oct 01 10:32:30 crc kubenswrapper[4735]: I1001 10:32:30.106005 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-fb4d-account-create-wt5pb"] Oct 01 10:32:30 crc kubenswrapper[4735]: I1001 10:32:30.106933 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-fb4d-account-create-wt5pb" Oct 01 10:32:30 crc kubenswrapper[4735]: I1001 10:32:30.109325 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 01 10:32:30 crc kubenswrapper[4735]: I1001 10:32:30.119095 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-fb4d-account-create-wt5pb"] Oct 01 10:32:30 crc kubenswrapper[4735]: I1001 10:32:30.222543 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9fsp\" (UniqueName: \"kubernetes.io/projected/11967ebb-ac0f-4d46-adb0-b100ee29528b-kube-api-access-z9fsp\") pod \"placement-fb4d-account-create-wt5pb\" (UID: \"11967ebb-ac0f-4d46-adb0-b100ee29528b\") " pod="openstack/placement-fb4d-account-create-wt5pb" Oct 01 10:32:30 crc kubenswrapper[4735]: I1001 10:32:30.323829 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9fsp\" (UniqueName: \"kubernetes.io/projected/11967ebb-ac0f-4d46-adb0-b100ee29528b-kube-api-access-z9fsp\") pod \"placement-fb4d-account-create-wt5pb\" (UID: \"11967ebb-ac0f-4d46-adb0-b100ee29528b\") " pod="openstack/placement-fb4d-account-create-wt5pb" Oct 01 10:32:30 crc kubenswrapper[4735]: I1001 
10:32:30.340445 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9fsp\" (UniqueName: \"kubernetes.io/projected/11967ebb-ac0f-4d46-adb0-b100ee29528b-kube-api-access-z9fsp\") pod \"placement-fb4d-account-create-wt5pb\" (UID: \"11967ebb-ac0f-4d46-adb0-b100ee29528b\") " pod="openstack/placement-fb4d-account-create-wt5pb" Oct 01 10:32:30 crc kubenswrapper[4735]: I1001 10:32:30.418129 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-4441-account-create-hgh8b"] Oct 01 10:32:30 crc kubenswrapper[4735]: I1001 10:32:30.419366 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4441-account-create-hgh8b" Oct 01 10:32:30 crc kubenswrapper[4735]: I1001 10:32:30.421219 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 01 10:32:30 crc kubenswrapper[4735]: I1001 10:32:30.423972 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-4441-account-create-hgh8b"] Oct 01 10:32:30 crc kubenswrapper[4735]: I1001 10:32:30.425986 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-fb4d-account-create-wt5pb" Oct 01 10:32:30 crc kubenswrapper[4735]: I1001 10:32:30.482846 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-rx5s2"] Oct 01 10:32:30 crc kubenswrapper[4735]: W1001 10:32:30.488509 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6bf79da_946c_4805_9ba3_0b58e969b33a.slice/crio-cf5892c4786a40adb494dbcd23069ce53fad41600a6fab59c55d88ffb10b42cd WatchSource:0}: Error finding container cf5892c4786a40adb494dbcd23069ce53fad41600a6fab59c55d88ffb10b42cd: Status 404 returned error can't find the container with id cf5892c4786a40adb494dbcd23069ce53fad41600a6fab59c55d88ffb10b42cd Oct 01 10:32:30 crc kubenswrapper[4735]: I1001 10:32:30.526411 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjbs2\" (UniqueName: \"kubernetes.io/projected/254c94ca-4d51-4559-b88f-5363ff751d36-kube-api-access-wjbs2\") pod \"glance-4441-account-create-hgh8b\" (UID: \"254c94ca-4d51-4559-b88f-5363ff751d36\") " pod="openstack/glance-4441-account-create-hgh8b" Oct 01 10:32:30 crc kubenswrapper[4735]: I1001 10:32:30.623825 4735 generic.go:334] "Generic (PLEG): container finished" podID="f74d6671-f7b0-46ae-91d4-ddb09a530249" containerID="99e7412a96e156ecb4e75f45624255a2002f318654941bab406bfe4f4add0f39" exitCode=0 Oct 01 10:32:30 crc kubenswrapper[4735]: I1001 10:32:30.624019 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f74d6671-f7b0-46ae-91d4-ddb09a530249","Type":"ContainerDied","Data":"99e7412a96e156ecb4e75f45624255a2002f318654941bab406bfe4f4add0f39"} Oct 01 10:32:30 crc kubenswrapper[4735]: I1001 10:32:30.626082 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"0be1b363-c0e5-4c73-9359-00032a6c8ab9","Type":"ContainerStarted","Data":"2057b4715aa35c68a82b85bf9b189bc9f2536a5e6a4a8fb0ef7f114940fa42e5"} Oct 01 10:32:30 crc kubenswrapper[4735]: I1001 10:32:30.626574 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:32:30 crc kubenswrapper[4735]: I1001 10:32:30.628288 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjbs2\" (UniqueName: \"kubernetes.io/projected/254c94ca-4d51-4559-b88f-5363ff751d36-kube-api-access-wjbs2\") pod \"glance-4441-account-create-hgh8b\" (UID: \"254c94ca-4d51-4559-b88f-5363ff751d36\") " pod="openstack/glance-4441-account-create-hgh8b" Oct 01 10:32:30 crc kubenswrapper[4735]: I1001 10:32:30.629691 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"651abe6c-1b2e-4652-a985-74f6cf2c7e17","Type":"ContainerStarted","Data":"da3d6af2d5440227d2b906f24366dfc2ce60bdc541bbb8b8288181afa84a355e"} Oct 01 10:32:30 crc kubenswrapper[4735]: I1001 10:32:30.633130 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-rx5s2" event={"ID":"d6bf79da-946c-4805-9ba3-0b58e969b33a","Type":"ContainerStarted","Data":"cf5892c4786a40adb494dbcd23069ce53fad41600a6fab59c55d88ffb10b42cd"} Oct 01 10:32:30 crc kubenswrapper[4735]: I1001 10:32:30.649774 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjbs2\" (UniqueName: \"kubernetes.io/projected/254c94ca-4d51-4559-b88f-5363ff751d36-kube-api-access-wjbs2\") pod \"glance-4441-account-create-hgh8b\" (UID: \"254c94ca-4d51-4559-b88f-5363ff751d36\") " pod="openstack/glance-4441-account-create-hgh8b" Oct 01 10:32:30 crc kubenswrapper[4735]: I1001 10:32:30.673414 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.647893454 podStartE2EDuration="55.673395478s" 
podCreationTimestamp="2025-10-01 10:31:35 +0000 UTC" firstStartedPulling="2025-10-01 10:31:37.203290308 +0000 UTC m=+855.896111570" lastFinishedPulling="2025-10-01 10:31:54.228792332 +0000 UTC m=+872.921613594" observedRunningTime="2025-10-01 10:32:30.671215688 +0000 UTC m=+909.364036950" watchObservedRunningTime="2025-10-01 10:32:30.673395478 +0000 UTC m=+909.366216740" Oct 01 10:32:30 crc kubenswrapper[4735]: I1001 10:32:30.739911 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4441-account-create-hgh8b" Oct 01 10:32:30 crc kubenswrapper[4735]: I1001 10:32:30.844378 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-fb4d-account-create-wt5pb"] Oct 01 10:32:30 crc kubenswrapper[4735]: W1001 10:32:30.942182 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11967ebb_ac0f_4d46_adb0_b100ee29528b.slice/crio-e0d5ff988b339efb5f9f4af9de681d5d698927261cf0168bde457b071586d525 WatchSource:0}: Error finding container e0d5ff988b339efb5f9f4af9de681d5d698927261cf0168bde457b071586d525: Status 404 returned error can't find the container with id e0d5ff988b339efb5f9f4af9de681d5d698927261cf0168bde457b071586d525 Oct 01 10:32:31 crc kubenswrapper[4735]: I1001 10:32:31.362627 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-4441-account-create-hgh8b"] Oct 01 10:32:31 crc kubenswrapper[4735]: I1001 10:32:31.644158 4735 generic.go:334] "Generic (PLEG): container finished" podID="d6bf79da-946c-4805-9ba3-0b58e969b33a" containerID="98568aad39b8f4d7cc86870dd3a4b1084add1343cd024e8f4ad7cb41052875b3" exitCode=0 Oct 01 10:32:31 crc kubenswrapper[4735]: I1001 10:32:31.644218 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-rx5s2" 
event={"ID":"d6bf79da-946c-4805-9ba3-0b58e969b33a","Type":"ContainerDied","Data":"98568aad39b8f4d7cc86870dd3a4b1084add1343cd024e8f4ad7cb41052875b3"} Oct 01 10:32:31 crc kubenswrapper[4735]: I1001 10:32:31.645223 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4441-account-create-hgh8b" event={"ID":"254c94ca-4d51-4559-b88f-5363ff751d36","Type":"ContainerStarted","Data":"3cbe5988d296450eab37bf2ab84072dadfdb357254fe7e47cc8b463b684e6d52"} Oct 01 10:32:31 crc kubenswrapper[4735]: I1001 10:32:31.645246 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4441-account-create-hgh8b" event={"ID":"254c94ca-4d51-4559-b88f-5363ff751d36","Type":"ContainerStarted","Data":"9ead1f170654559709a9d0d1830f0f6f4707cf4bbf6c62a29b59626321982eec"} Oct 01 10:32:31 crc kubenswrapper[4735]: I1001 10:32:31.646262 4735 generic.go:334] "Generic (PLEG): container finished" podID="11967ebb-ac0f-4d46-adb0-b100ee29528b" containerID="be5d117abefe9e28d4ed58fc95f5ace4de3a7aa6e3500377022e8b00f3f68d0d" exitCode=0 Oct 01 10:32:31 crc kubenswrapper[4735]: I1001 10:32:31.646297 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fb4d-account-create-wt5pb" event={"ID":"11967ebb-ac0f-4d46-adb0-b100ee29528b","Type":"ContainerDied","Data":"be5d117abefe9e28d4ed58fc95f5ace4de3a7aa6e3500377022e8b00f3f68d0d"} Oct 01 10:32:31 crc kubenswrapper[4735]: I1001 10:32:31.646313 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fb4d-account-create-wt5pb" event={"ID":"11967ebb-ac0f-4d46-adb0-b100ee29528b","Type":"ContainerStarted","Data":"e0d5ff988b339efb5f9f4af9de681d5d698927261cf0168bde457b071586d525"} Oct 01 10:32:31 crc kubenswrapper[4735]: I1001 10:32:31.650095 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f74d6671-f7b0-46ae-91d4-ddb09a530249","Type":"ContainerStarted","Data":"6948c7ada5c15da2f6d84ce37854bb9e6c999f4068d03ee0404a038e49962128"} Oct 01 
10:32:31 crc kubenswrapper[4735]: I1001 10:32:31.650312 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 01 10:32:31 crc kubenswrapper[4735]: I1001 10:32:31.653605 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"651abe6c-1b2e-4652-a985-74f6cf2c7e17","Type":"ContainerStarted","Data":"dc995586fb42eee157ed9da41efdac0cb5013aacb67c8392467f0d0e43af513a"} Oct 01 10:32:31 crc kubenswrapper[4735]: I1001 10:32:31.653652 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"651abe6c-1b2e-4652-a985-74f6cf2c7e17","Type":"ContainerStarted","Data":"8e512985c6c069eb9bfd862ab347b8f918338508b3c7c2ee34b2bdc531ff3357"} Oct 01 10:32:31 crc kubenswrapper[4735]: I1001 10:32:31.653667 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"651abe6c-1b2e-4652-a985-74f6cf2c7e17","Type":"ContainerStarted","Data":"7d512b597f345e6aa4b3ec9a70306e9ead7ab1ff4678ea68429dd2a6fd5b923b"} Oct 01 10:32:31 crc kubenswrapper[4735]: I1001 10:32:31.692466 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.321073849 podStartE2EDuration="56.692446509s" podCreationTimestamp="2025-10-01 10:31:35 +0000 UTC" firstStartedPulling="2025-10-01 10:31:36.936301242 +0000 UTC m=+855.629122504" lastFinishedPulling="2025-10-01 10:31:56.307673902 +0000 UTC m=+875.000495164" observedRunningTime="2025-10-01 10:32:31.686924912 +0000 UTC m=+910.379746174" watchObservedRunningTime="2025-10-01 10:32:31.692446509 +0000 UTC m=+910.385267771" Oct 01 10:32:32 crc kubenswrapper[4735]: I1001 10:32:32.662809 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"651abe6c-1b2e-4652-a985-74f6cf2c7e17","Type":"ContainerStarted","Data":"a126e59e191a476d2df49892c786a770bf9498d4edcad641794d3530c4eb1327"} Oct 01 10:32:32 crc 
kubenswrapper[4735]: I1001 10:32:32.664290 4735 generic.go:334] "Generic (PLEG): container finished" podID="254c94ca-4d51-4559-b88f-5363ff751d36" containerID="3cbe5988d296450eab37bf2ab84072dadfdb357254fe7e47cc8b463b684e6d52" exitCode=0 Oct 01 10:32:32 crc kubenswrapper[4735]: I1001 10:32:32.665036 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4441-account-create-hgh8b" event={"ID":"254c94ca-4d51-4559-b88f-5363ff751d36","Type":"ContainerDied","Data":"3cbe5988d296450eab37bf2ab84072dadfdb357254fe7e47cc8b463b684e6d52"} Oct 01 10:32:33 crc kubenswrapper[4735]: I1001 10:32:33.087191 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-rx5s2" Oct 01 10:32:33 crc kubenswrapper[4735]: I1001 10:32:33.095465 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4441-account-create-hgh8b" Oct 01 10:32:33 crc kubenswrapper[4735]: I1001 10:32:33.102040 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-fb4d-account-create-wt5pb" Oct 01 10:32:33 crc kubenswrapper[4735]: I1001 10:32:33.171958 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9fsp\" (UniqueName: \"kubernetes.io/projected/11967ebb-ac0f-4d46-adb0-b100ee29528b-kube-api-access-z9fsp\") pod \"11967ebb-ac0f-4d46-adb0-b100ee29528b\" (UID: \"11967ebb-ac0f-4d46-adb0-b100ee29528b\") " Oct 01 10:32:33 crc kubenswrapper[4735]: I1001 10:32:33.172149 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjbs2\" (UniqueName: \"kubernetes.io/projected/254c94ca-4d51-4559-b88f-5363ff751d36-kube-api-access-wjbs2\") pod \"254c94ca-4d51-4559-b88f-5363ff751d36\" (UID: \"254c94ca-4d51-4559-b88f-5363ff751d36\") " Oct 01 10:32:33 crc kubenswrapper[4735]: I1001 10:32:33.172221 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjkqf\" (UniqueName: \"kubernetes.io/projected/d6bf79da-946c-4805-9ba3-0b58e969b33a-kube-api-access-hjkqf\") pod \"d6bf79da-946c-4805-9ba3-0b58e969b33a\" (UID: \"d6bf79da-946c-4805-9ba3-0b58e969b33a\") " Oct 01 10:32:33 crc kubenswrapper[4735]: I1001 10:32:33.177097 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6bf79da-946c-4805-9ba3-0b58e969b33a-kube-api-access-hjkqf" (OuterVolumeSpecName: "kube-api-access-hjkqf") pod "d6bf79da-946c-4805-9ba3-0b58e969b33a" (UID: "d6bf79da-946c-4805-9ba3-0b58e969b33a"). InnerVolumeSpecName "kube-api-access-hjkqf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:32:33 crc kubenswrapper[4735]: I1001 10:32:33.177419 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11967ebb-ac0f-4d46-adb0-b100ee29528b-kube-api-access-z9fsp" (OuterVolumeSpecName: "kube-api-access-z9fsp") pod "11967ebb-ac0f-4d46-adb0-b100ee29528b" (UID: "11967ebb-ac0f-4d46-adb0-b100ee29528b"). InnerVolumeSpecName "kube-api-access-z9fsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:32:33 crc kubenswrapper[4735]: I1001 10:32:33.177480 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/254c94ca-4d51-4559-b88f-5363ff751d36-kube-api-access-wjbs2" (OuterVolumeSpecName: "kube-api-access-wjbs2") pod "254c94ca-4d51-4559-b88f-5363ff751d36" (UID: "254c94ca-4d51-4559-b88f-5363ff751d36"). InnerVolumeSpecName "kube-api-access-wjbs2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:32:33 crc kubenswrapper[4735]: I1001 10:32:33.274345 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjbs2\" (UniqueName: \"kubernetes.io/projected/254c94ca-4d51-4559-b88f-5363ff751d36-kube-api-access-wjbs2\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:33 crc kubenswrapper[4735]: I1001 10:32:33.274385 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjkqf\" (UniqueName: \"kubernetes.io/projected/d6bf79da-946c-4805-9ba3-0b58e969b33a-kube-api-access-hjkqf\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:33 crc kubenswrapper[4735]: I1001 10:32:33.274394 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9fsp\" (UniqueName: \"kubernetes.io/projected/11967ebb-ac0f-4d46-adb0-b100ee29528b-kube-api-access-z9fsp\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:33 crc kubenswrapper[4735]: I1001 10:32:33.677422 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-rx5s2" 
event={"ID":"d6bf79da-946c-4805-9ba3-0b58e969b33a","Type":"ContainerDied","Data":"cf5892c4786a40adb494dbcd23069ce53fad41600a6fab59c55d88ffb10b42cd"} Oct 01 10:32:33 crc kubenswrapper[4735]: I1001 10:32:33.677452 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-rx5s2" Oct 01 10:32:33 crc kubenswrapper[4735]: I1001 10:32:33.677460 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf5892c4786a40adb494dbcd23069ce53fad41600a6fab59c55d88ffb10b42cd" Oct 01 10:32:33 crc kubenswrapper[4735]: I1001 10:32:33.683442 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4441-account-create-hgh8b" Oct 01 10:32:33 crc kubenswrapper[4735]: I1001 10:32:33.683514 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4441-account-create-hgh8b" event={"ID":"254c94ca-4d51-4559-b88f-5363ff751d36","Type":"ContainerDied","Data":"9ead1f170654559709a9d0d1830f0f6f4707cf4bbf6c62a29b59626321982eec"} Oct 01 10:32:33 crc kubenswrapper[4735]: I1001 10:32:33.683541 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ead1f170654559709a9d0d1830f0f6f4707cf4bbf6c62a29b59626321982eec" Oct 01 10:32:33 crc kubenswrapper[4735]: I1001 10:32:33.686071 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-fb4d-account-create-wt5pb" event={"ID":"11967ebb-ac0f-4d46-adb0-b100ee29528b","Type":"ContainerDied","Data":"e0d5ff988b339efb5f9f4af9de681d5d698927261cf0168bde457b071586d525"} Oct 01 10:32:33 crc kubenswrapper[4735]: I1001 10:32:33.686131 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0d5ff988b339efb5f9f4af9de681d5d698927261cf0168bde457b071586d525" Oct 01 10:32:33 crc kubenswrapper[4735]: I1001 10:32:33.686077 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-fb4d-account-create-wt5pb" Oct 01 10:32:33 crc kubenswrapper[4735]: I1001 10:32:33.693416 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"651abe6c-1b2e-4652-a985-74f6cf2c7e17","Type":"ContainerStarted","Data":"f698f7245bc9cc0f470590a5e023a9b7f3a14bb341d36b78a7fd8fd95075da9c"} Oct 01 10:32:33 crc kubenswrapper[4735]: I1001 10:32:33.693546 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"651abe6c-1b2e-4652-a985-74f6cf2c7e17","Type":"ContainerStarted","Data":"c953d7e606b12d5fedc59f8eb0dcc4ce2d7033d366ca5a367e89fbe764e8e426"} Oct 01 10:32:33 crc kubenswrapper[4735]: I1001 10:32:33.693582 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"651abe6c-1b2e-4652-a985-74f6cf2c7e17","Type":"ContainerStarted","Data":"99c7c0a8dc2a6ff03b3ac77dc5930ccb1bbdc0de997381a295629dca82df924d"} Oct 01 10:32:33 crc kubenswrapper[4735]: I1001 10:32:33.693607 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"651abe6c-1b2e-4652-a985-74f6cf2c7e17","Type":"ContainerStarted","Data":"910bb770d4456335cf87d7d7b1e6878a0f86d34ec4a7612cf27074bd4d07f407"} Oct 01 10:32:34 crc kubenswrapper[4735]: I1001 10:32:34.705555 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"651abe6c-1b2e-4652-a985-74f6cf2c7e17","Type":"ContainerStarted","Data":"4aadb701048e58f5c5fb5c20b7ef6e29e7fbebd16aa4d13233bae65f9a5acdc9"} Oct 01 10:32:35 crc kubenswrapper[4735]: I1001 10:32:35.544174 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-qksn5"] Oct 01 10:32:35 crc kubenswrapper[4735]: E1001 10:32:35.544693 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="254c94ca-4d51-4559-b88f-5363ff751d36" containerName="mariadb-account-create" Oct 01 10:32:35 crc kubenswrapper[4735]: I1001 
10:32:35.544708 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="254c94ca-4d51-4559-b88f-5363ff751d36" containerName="mariadb-account-create" Oct 01 10:32:35 crc kubenswrapper[4735]: E1001 10:32:35.544718 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11967ebb-ac0f-4d46-adb0-b100ee29528b" containerName="mariadb-account-create" Oct 01 10:32:35 crc kubenswrapper[4735]: I1001 10:32:35.544724 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="11967ebb-ac0f-4d46-adb0-b100ee29528b" containerName="mariadb-account-create" Oct 01 10:32:35 crc kubenswrapper[4735]: E1001 10:32:35.544733 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6bf79da-946c-4805-9ba3-0b58e969b33a" containerName="mariadb-database-create" Oct 01 10:32:35 crc kubenswrapper[4735]: I1001 10:32:35.544740 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6bf79da-946c-4805-9ba3-0b58e969b33a" containerName="mariadb-database-create" Oct 01 10:32:35 crc kubenswrapper[4735]: I1001 10:32:35.544903 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="254c94ca-4d51-4559-b88f-5363ff751d36" containerName="mariadb-account-create" Oct 01 10:32:35 crc kubenswrapper[4735]: I1001 10:32:35.544938 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6bf79da-946c-4805-9ba3-0b58e969b33a" containerName="mariadb-database-create" Oct 01 10:32:35 crc kubenswrapper[4735]: I1001 10:32:35.544954 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="11967ebb-ac0f-4d46-adb0-b100ee29528b" containerName="mariadb-account-create" Oct 01 10:32:35 crc kubenswrapper[4735]: I1001 10:32:35.545397 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-qksn5" Oct 01 10:32:35 crc kubenswrapper[4735]: I1001 10:32:35.547069 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-8zx2n" Oct 01 10:32:35 crc kubenswrapper[4735]: I1001 10:32:35.547228 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 01 10:32:35 crc kubenswrapper[4735]: I1001 10:32:35.560710 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-qksn5"] Oct 01 10:32:35 crc kubenswrapper[4735]: I1001 10:32:35.610481 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/eba15328-be82-4e04-a4bf-8097322615de-db-sync-config-data\") pod \"glance-db-sync-qksn5\" (UID: \"eba15328-be82-4e04-a4bf-8097322615de\") " pod="openstack/glance-db-sync-qksn5" Oct 01 10:32:35 crc kubenswrapper[4735]: I1001 10:32:35.610748 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eba15328-be82-4e04-a4bf-8097322615de-config-data\") pod \"glance-db-sync-qksn5\" (UID: \"eba15328-be82-4e04-a4bf-8097322615de\") " pod="openstack/glance-db-sync-qksn5" Oct 01 10:32:35 crc kubenswrapper[4735]: I1001 10:32:35.610823 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eba15328-be82-4e04-a4bf-8097322615de-combined-ca-bundle\") pod \"glance-db-sync-qksn5\" (UID: \"eba15328-be82-4e04-a4bf-8097322615de\") " pod="openstack/glance-db-sync-qksn5" Oct 01 10:32:35 crc kubenswrapper[4735]: I1001 10:32:35.611047 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxskl\" (UniqueName: 
\"kubernetes.io/projected/eba15328-be82-4e04-a4bf-8097322615de-kube-api-access-rxskl\") pod \"glance-db-sync-qksn5\" (UID: \"eba15328-be82-4e04-a4bf-8097322615de\") " pod="openstack/glance-db-sync-qksn5" Oct 01 10:32:35 crc kubenswrapper[4735]: I1001 10:32:35.712475 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxskl\" (UniqueName: \"kubernetes.io/projected/eba15328-be82-4e04-a4bf-8097322615de-kube-api-access-rxskl\") pod \"glance-db-sync-qksn5\" (UID: \"eba15328-be82-4e04-a4bf-8097322615de\") " pod="openstack/glance-db-sync-qksn5" Oct 01 10:32:35 crc kubenswrapper[4735]: I1001 10:32:35.712640 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/eba15328-be82-4e04-a4bf-8097322615de-db-sync-config-data\") pod \"glance-db-sync-qksn5\" (UID: \"eba15328-be82-4e04-a4bf-8097322615de\") " pod="openstack/glance-db-sync-qksn5" Oct 01 10:32:35 crc kubenswrapper[4735]: I1001 10:32:35.712702 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eba15328-be82-4e04-a4bf-8097322615de-config-data\") pod \"glance-db-sync-qksn5\" (UID: \"eba15328-be82-4e04-a4bf-8097322615de\") " pod="openstack/glance-db-sync-qksn5" Oct 01 10:32:35 crc kubenswrapper[4735]: I1001 10:32:35.712736 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eba15328-be82-4e04-a4bf-8097322615de-combined-ca-bundle\") pod \"glance-db-sync-qksn5\" (UID: \"eba15328-be82-4e04-a4bf-8097322615de\") " pod="openstack/glance-db-sync-qksn5" Oct 01 10:32:35 crc kubenswrapper[4735]: I1001 10:32:35.718337 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eba15328-be82-4e04-a4bf-8097322615de-config-data\") pod \"glance-db-sync-qksn5\" (UID: 
\"eba15328-be82-4e04-a4bf-8097322615de\") " pod="openstack/glance-db-sync-qksn5" Oct 01 10:32:35 crc kubenswrapper[4735]: I1001 10:32:35.718332 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eba15328-be82-4e04-a4bf-8097322615de-combined-ca-bundle\") pod \"glance-db-sync-qksn5\" (UID: \"eba15328-be82-4e04-a4bf-8097322615de\") " pod="openstack/glance-db-sync-qksn5" Oct 01 10:32:35 crc kubenswrapper[4735]: I1001 10:32:35.720766 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/eba15328-be82-4e04-a4bf-8097322615de-db-sync-config-data\") pod \"glance-db-sync-qksn5\" (UID: \"eba15328-be82-4e04-a4bf-8097322615de\") " pod="openstack/glance-db-sync-qksn5" Oct 01 10:32:35 crc kubenswrapper[4735]: I1001 10:32:35.721310 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"651abe6c-1b2e-4652-a985-74f6cf2c7e17","Type":"ContainerStarted","Data":"18d01a1d6d8c86ec98e530dd1c358dbc2c8522923cd8b51043a5622dc49b2c7d"} Oct 01 10:32:35 crc kubenswrapper[4735]: I1001 10:32:35.729693 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxskl\" (UniqueName: \"kubernetes.io/projected/eba15328-be82-4e04-a4bf-8097322615de-kube-api-access-rxskl\") pod \"glance-db-sync-qksn5\" (UID: \"eba15328-be82-4e04-a4bf-8097322615de\") " pod="openstack/glance-db-sync-qksn5" Oct 01 10:32:35 crc kubenswrapper[4735]: I1001 10:32:35.862549 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-qksn5" Oct 01 10:32:36 crc kubenswrapper[4735]: I1001 10:32:36.307702 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-zmn7b" podUID="3bcc7869-f6b2-4c99-adde-40577b12c99d" containerName="ovn-controller" probeResult="failure" output=< Oct 01 10:32:36 crc kubenswrapper[4735]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 01 10:32:36 crc kubenswrapper[4735]: > Oct 01 10:32:36 crc kubenswrapper[4735]: I1001 10:32:36.320078 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-4k7wp" Oct 01 10:32:36 crc kubenswrapper[4735]: I1001 10:32:36.323186 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-4k7wp" Oct 01 10:32:36 crc kubenswrapper[4735]: I1001 10:32:36.391215 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-qksn5"] Oct 01 10:32:36 crc kubenswrapper[4735]: W1001 10:32:36.404854 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeba15328_be82_4e04_a4bf_8097322615de.slice/crio-a26e068a9dfbcecc6397bc92bcb601362e9bd8ceb6bed20bd04602032c13975f WatchSource:0}: Error finding container a26e068a9dfbcecc6397bc92bcb601362e9bd8ceb6bed20bd04602032c13975f: Status 404 returned error can't find the container with id a26e068a9dfbcecc6397bc92bcb601362e9bd8ceb6bed20bd04602032c13975f Oct 01 10:32:36 crc kubenswrapper[4735]: I1001 10:32:36.576219 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-zmn7b-config-qk62c"] Oct 01 10:32:36 crc kubenswrapper[4735]: I1001 10:32:36.577218 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-zmn7b-config-qk62c" Oct 01 10:32:36 crc kubenswrapper[4735]: I1001 10:32:36.579614 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 01 10:32:36 crc kubenswrapper[4735]: I1001 10:32:36.599865 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zmn7b-config-qk62c"] Oct 01 10:32:36 crc kubenswrapper[4735]: I1001 10:32:36.624716 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-scripts\") pod \"ovn-controller-zmn7b-config-qk62c\" (UID: \"f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd\") " pod="openstack/ovn-controller-zmn7b-config-qk62c" Oct 01 10:32:36 crc kubenswrapper[4735]: I1001 10:32:36.624773 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-var-run\") pod \"ovn-controller-zmn7b-config-qk62c\" (UID: \"f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd\") " pod="openstack/ovn-controller-zmn7b-config-qk62c" Oct 01 10:32:36 crc kubenswrapper[4735]: I1001 10:32:36.624949 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9qzj\" (UniqueName: \"kubernetes.io/projected/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-kube-api-access-x9qzj\") pod \"ovn-controller-zmn7b-config-qk62c\" (UID: \"f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd\") " pod="openstack/ovn-controller-zmn7b-config-qk62c" Oct 01 10:32:36 crc kubenswrapper[4735]: I1001 10:32:36.625011 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-var-log-ovn\") pod \"ovn-controller-zmn7b-config-qk62c\" (UID: 
\"f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd\") " pod="openstack/ovn-controller-zmn7b-config-qk62c" Oct 01 10:32:36 crc kubenswrapper[4735]: I1001 10:32:36.625168 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-var-run-ovn\") pod \"ovn-controller-zmn7b-config-qk62c\" (UID: \"f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd\") " pod="openstack/ovn-controller-zmn7b-config-qk62c" Oct 01 10:32:36 crc kubenswrapper[4735]: I1001 10:32:36.625253 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-additional-scripts\") pod \"ovn-controller-zmn7b-config-qk62c\" (UID: \"f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd\") " pod="openstack/ovn-controller-zmn7b-config-qk62c" Oct 01 10:32:36 crc kubenswrapper[4735]: I1001 10:32:36.726927 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-scripts\") pod \"ovn-controller-zmn7b-config-qk62c\" (UID: \"f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd\") " pod="openstack/ovn-controller-zmn7b-config-qk62c" Oct 01 10:32:36 crc kubenswrapper[4735]: I1001 10:32:36.727000 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-var-run\") pod \"ovn-controller-zmn7b-config-qk62c\" (UID: \"f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd\") " pod="openstack/ovn-controller-zmn7b-config-qk62c" Oct 01 10:32:36 crc kubenswrapper[4735]: I1001 10:32:36.727088 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9qzj\" (UniqueName: \"kubernetes.io/projected/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-kube-api-access-x9qzj\") pod 
\"ovn-controller-zmn7b-config-qk62c\" (UID: \"f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd\") " pod="openstack/ovn-controller-zmn7b-config-qk62c" Oct 01 10:32:36 crc kubenswrapper[4735]: I1001 10:32:36.727122 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-var-log-ovn\") pod \"ovn-controller-zmn7b-config-qk62c\" (UID: \"f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd\") " pod="openstack/ovn-controller-zmn7b-config-qk62c" Oct 01 10:32:36 crc kubenswrapper[4735]: I1001 10:32:36.727166 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-var-run-ovn\") pod \"ovn-controller-zmn7b-config-qk62c\" (UID: \"f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd\") " pod="openstack/ovn-controller-zmn7b-config-qk62c" Oct 01 10:32:36 crc kubenswrapper[4735]: I1001 10:32:36.727198 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-additional-scripts\") pod \"ovn-controller-zmn7b-config-qk62c\" (UID: \"f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd\") " pod="openstack/ovn-controller-zmn7b-config-qk62c" Oct 01 10:32:36 crc kubenswrapper[4735]: I1001 10:32:36.727374 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-var-run\") pod \"ovn-controller-zmn7b-config-qk62c\" (UID: \"f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd\") " pod="openstack/ovn-controller-zmn7b-config-qk62c" Oct 01 10:32:36 crc kubenswrapper[4735]: I1001 10:32:36.727460 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-var-log-ovn\") pod \"ovn-controller-zmn7b-config-qk62c\" (UID: 
\"f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd\") " pod="openstack/ovn-controller-zmn7b-config-qk62c" Oct 01 10:32:36 crc kubenswrapper[4735]: I1001 10:32:36.727510 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-var-run-ovn\") pod \"ovn-controller-zmn7b-config-qk62c\" (UID: \"f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd\") " pod="openstack/ovn-controller-zmn7b-config-qk62c" Oct 01 10:32:36 crc kubenswrapper[4735]: I1001 10:32:36.727966 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-additional-scripts\") pod \"ovn-controller-zmn7b-config-qk62c\" (UID: \"f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd\") " pod="openstack/ovn-controller-zmn7b-config-qk62c" Oct 01 10:32:36 crc kubenswrapper[4735]: I1001 10:32:36.728830 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-scripts\") pod \"ovn-controller-zmn7b-config-qk62c\" (UID: \"f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd\") " pod="openstack/ovn-controller-zmn7b-config-qk62c" Oct 01 10:32:36 crc kubenswrapper[4735]: I1001 10:32:36.731820 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qksn5" event={"ID":"eba15328-be82-4e04-a4bf-8097322615de","Type":"ContainerStarted","Data":"a26e068a9dfbcecc6397bc92bcb601362e9bd8ceb6bed20bd04602032c13975f"} Oct 01 10:32:36 crc kubenswrapper[4735]: I1001 10:32:36.747272 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9qzj\" (UniqueName: \"kubernetes.io/projected/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-kube-api-access-x9qzj\") pod \"ovn-controller-zmn7b-config-qk62c\" (UID: \"f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd\") " pod="openstack/ovn-controller-zmn7b-config-qk62c" Oct 01 10:32:36 crc 
kubenswrapper[4735]: I1001 10:32:36.893923 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zmn7b-config-qk62c" Oct 01 10:32:37 crc kubenswrapper[4735]: I1001 10:32:37.316021 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zmn7b-config-qk62c"] Oct 01 10:32:37 crc kubenswrapper[4735]: W1001 10:32:37.318588 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8fbdb0c_cb25_417b_9f16_c7e09fb3fbfd.slice/crio-011537470eacfbb2fd7e7827c07bb7d72fb754c7dddc80f7f520cd6157167bd2 WatchSource:0}: Error finding container 011537470eacfbb2fd7e7827c07bb7d72fb754c7dddc80f7f520cd6157167bd2: Status 404 returned error can't find the container with id 011537470eacfbb2fd7e7827c07bb7d72fb754c7dddc80f7f520cd6157167bd2 Oct 01 10:32:37 crc kubenswrapper[4735]: I1001 10:32:37.741091 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zmn7b-config-qk62c" event={"ID":"f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd","Type":"ContainerStarted","Data":"011537470eacfbb2fd7e7827c07bb7d72fb754c7dddc80f7f520cd6157167bd2"} Oct 01 10:32:37 crc kubenswrapper[4735]: I1001 10:32:37.746481 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"651abe6c-1b2e-4652-a985-74f6cf2c7e17","Type":"ContainerStarted","Data":"2734f4ed9c49d2172a34c9ed3414284ba4501cfb8ee87d2c456dd4b340e86d9f"} Oct 01 10:32:38 crc kubenswrapper[4735]: I1001 10:32:38.763094 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"651abe6c-1b2e-4652-a985-74f6cf2c7e17","Type":"ContainerStarted","Data":"f1e19c6dc76406bd97c2aa155b93a798b2b14faf94376a525cf9ae01f1681475"} Oct 01 10:32:38 crc kubenswrapper[4735]: I1001 10:32:38.763772 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"651abe6c-1b2e-4652-a985-74f6cf2c7e17","Type":"ContainerStarted","Data":"357ed117c0e12b3710cefd3b50eb19d8d34ea2da9a71043d89f0054172cf4c7a"} Oct 01 10:32:38 crc kubenswrapper[4735]: I1001 10:32:38.763798 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"651abe6c-1b2e-4652-a985-74f6cf2c7e17","Type":"ContainerStarted","Data":"ac779c8476ae285bca7943d290617f26369b344cfbf19e42b95e9a3d0edbb482"} Oct 01 10:32:38 crc kubenswrapper[4735]: I1001 10:32:38.765208 4735 generic.go:334] "Generic (PLEG): container finished" podID="f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd" containerID="e42843086077517482027ea772525efff760b50a4447498be6b743665f10b9d4" exitCode=0 Oct 01 10:32:38 crc kubenswrapper[4735]: I1001 10:32:38.765247 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zmn7b-config-qk62c" event={"ID":"f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd","Type":"ContainerDied","Data":"e42843086077517482027ea772525efff760b50a4447498be6b743665f10b9d4"} Oct 01 10:32:39 crc kubenswrapper[4735]: I1001 10:32:39.808147 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"651abe6c-1b2e-4652-a985-74f6cf2c7e17","Type":"ContainerStarted","Data":"6f08aa00631e8f62de99e803ce776b8a2f581f5fa02e9781683cbbecc79b8ea0"} Oct 01 10:32:39 crc kubenswrapper[4735]: I1001 10:32:39.809749 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-73ae-account-create-n72rm"] Oct 01 10:32:39 crc kubenswrapper[4735]: I1001 10:32:39.810707 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-73ae-account-create-n72rm" Oct 01 10:32:39 crc kubenswrapper[4735]: I1001 10:32:39.812298 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 01 10:32:39 crc kubenswrapper[4735]: I1001 10:32:39.825188 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-73ae-account-create-n72rm"] Oct 01 10:32:39 crc kubenswrapper[4735]: I1001 10:32:39.852423 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=23.355441516 podStartE2EDuration="27.852407021s" podCreationTimestamp="2025-10-01 10:32:12 +0000 UTC" firstStartedPulling="2025-10-01 10:32:29.9490532 +0000 UTC m=+908.641874472" lastFinishedPulling="2025-10-01 10:32:34.446018715 +0000 UTC m=+913.138839977" observedRunningTime="2025-10-01 10:32:39.848457117 +0000 UTC m=+918.541278389" watchObservedRunningTime="2025-10-01 10:32:39.852407021 +0000 UTC m=+918.545228283" Oct 01 10:32:39 crc kubenswrapper[4735]: I1001 10:32:39.878370 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4jb6\" (UniqueName: \"kubernetes.io/projected/52c13772-7115-4653-949d-ae1ef126bfd9-kube-api-access-g4jb6\") pod \"keystone-73ae-account-create-n72rm\" (UID: \"52c13772-7115-4653-949d-ae1ef126bfd9\") " pod="openstack/keystone-73ae-account-create-n72rm" Oct 01 10:32:39 crc kubenswrapper[4735]: I1001 10:32:39.980881 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4jb6\" (UniqueName: \"kubernetes.io/projected/52c13772-7115-4653-949d-ae1ef126bfd9-kube-api-access-g4jb6\") pod \"keystone-73ae-account-create-n72rm\" (UID: \"52c13772-7115-4653-949d-ae1ef126bfd9\") " pod="openstack/keystone-73ae-account-create-n72rm" Oct 01 10:32:39 crc kubenswrapper[4735]: I1001 10:32:39.999089 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-g4jb6\" (UniqueName: \"kubernetes.io/projected/52c13772-7115-4653-949d-ae1ef126bfd9-kube-api-access-g4jb6\") pod \"keystone-73ae-account-create-n72rm\" (UID: \"52c13772-7115-4653-949d-ae1ef126bfd9\") " pod="openstack/keystone-73ae-account-create-n72rm" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.085049 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-szpsz"] Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.086280 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-szpsz" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.093331 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.113364 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-szpsz"] Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.135093 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-73ae-account-create-n72rm" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.149985 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-zmn7b-config-qk62c" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.184757 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-var-log-ovn\") pod \"f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd\" (UID: \"f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd\") " Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.185200 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-scripts\") pod \"f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd\" (UID: \"f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd\") " Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.185323 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-var-run-ovn\") pod \"f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd\" (UID: \"f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd\") " Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.185446 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9qzj\" (UniqueName: \"kubernetes.io/projected/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-kube-api-access-x9qzj\") pod \"f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd\" (UID: \"f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd\") " Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.185540 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-var-run\") pod \"f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd\" (UID: \"f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd\") " Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.185015 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd" (UID: "f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.185704 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd" (UID: "f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.186038 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-var-run" (OuterVolumeSpecName: "var-run") pod "f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd" (UID: "f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.186523 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-additional-scripts\") pod \"f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd\" (UID: \"f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd\") " Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.186761 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzfvg\" (UniqueName: \"kubernetes.io/projected/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-kube-api-access-hzfvg\") pod \"dnsmasq-dns-77585f5f8c-szpsz\" (UID: \"2aebb4f3-5596-43e8-bbc4-dfb874bccd88\") " pod="openstack/dnsmasq-dns-77585f5f8c-szpsz" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.186864 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-szpsz\" (UID: \"2aebb4f3-5596-43e8-bbc4-dfb874bccd88\") " pod="openstack/dnsmasq-dns-77585f5f8c-szpsz" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.186976 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-szpsz\" (UID: \"2aebb4f3-5596-43e8-bbc4-dfb874bccd88\") " pod="openstack/dnsmasq-dns-77585f5f8c-szpsz" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.187087 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-szpsz\" (UID: 
\"2aebb4f3-5596-43e8-bbc4-dfb874bccd88\") " pod="openstack/dnsmasq-dns-77585f5f8c-szpsz" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.187295 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-szpsz\" (UID: \"2aebb4f3-5596-43e8-bbc4-dfb874bccd88\") " pod="openstack/dnsmasq-dns-77585f5f8c-szpsz" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.187389 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-config\") pod \"dnsmasq-dns-77585f5f8c-szpsz\" (UID: \"2aebb4f3-5596-43e8-bbc4-dfb874bccd88\") " pod="openstack/dnsmasq-dns-77585f5f8c-szpsz" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.187646 4735 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.187751 4735 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.187855 4735 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-var-run\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.187902 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-scripts" (OuterVolumeSpecName: "scripts") pod "f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd" (UID: 
"f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.188102 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd" (UID: "f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.190221 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-kube-api-access-x9qzj" (OuterVolumeSpecName: "kube-api-access-x9qzj") pod "f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd" (UID: "f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd"). InnerVolumeSpecName "kube-api-access-x9qzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.289799 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzfvg\" (UniqueName: \"kubernetes.io/projected/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-kube-api-access-hzfvg\") pod \"dnsmasq-dns-77585f5f8c-szpsz\" (UID: \"2aebb4f3-5596-43e8-bbc4-dfb874bccd88\") " pod="openstack/dnsmasq-dns-77585f5f8c-szpsz" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.289874 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-szpsz\" (UID: \"2aebb4f3-5596-43e8-bbc4-dfb874bccd88\") " pod="openstack/dnsmasq-dns-77585f5f8c-szpsz" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.289892 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-szpsz\" (UID: \"2aebb4f3-5596-43e8-bbc4-dfb874bccd88\") " pod="openstack/dnsmasq-dns-77585f5f8c-szpsz" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.289911 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-szpsz\" (UID: \"2aebb4f3-5596-43e8-bbc4-dfb874bccd88\") " pod="openstack/dnsmasq-dns-77585f5f8c-szpsz" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.289934 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-szpsz\" (UID: \"2aebb4f3-5596-43e8-bbc4-dfb874bccd88\") " pod="openstack/dnsmasq-dns-77585f5f8c-szpsz" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.289952 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-config\") pod \"dnsmasq-dns-77585f5f8c-szpsz\" (UID: \"2aebb4f3-5596-43e8-bbc4-dfb874bccd88\") " pod="openstack/dnsmasq-dns-77585f5f8c-szpsz" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.290047 4735 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.290058 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.290067 4735 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9qzj\" (UniqueName: \"kubernetes.io/projected/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd-kube-api-access-x9qzj\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.291005 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-config\") pod \"dnsmasq-dns-77585f5f8c-szpsz\" (UID: \"2aebb4f3-5596-43e8-bbc4-dfb874bccd88\") " pod="openstack/dnsmasq-dns-77585f5f8c-szpsz" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.291152 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-szpsz\" (UID: \"2aebb4f3-5596-43e8-bbc4-dfb874bccd88\") " pod="openstack/dnsmasq-dns-77585f5f8c-szpsz" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.291555 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-szpsz\" (UID: \"2aebb4f3-5596-43e8-bbc4-dfb874bccd88\") " pod="openstack/dnsmasq-dns-77585f5f8c-szpsz" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.292005 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-szpsz\" (UID: \"2aebb4f3-5596-43e8-bbc4-dfb874bccd88\") " pod="openstack/dnsmasq-dns-77585f5f8c-szpsz" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.292145 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-dns-swift-storage-0\") pod 
\"dnsmasq-dns-77585f5f8c-szpsz\" (UID: \"2aebb4f3-5596-43e8-bbc4-dfb874bccd88\") " pod="openstack/dnsmasq-dns-77585f5f8c-szpsz" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.308701 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzfvg\" (UniqueName: \"kubernetes.io/projected/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-kube-api-access-hzfvg\") pod \"dnsmasq-dns-77585f5f8c-szpsz\" (UID: \"2aebb4f3-5596-43e8-bbc4-dfb874bccd88\") " pod="openstack/dnsmasq-dns-77585f5f8c-szpsz" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.519397 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-szpsz" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.550333 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-73ae-account-create-n72rm"] Oct 01 10:32:40 crc kubenswrapper[4735]: W1001 10:32:40.558200 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52c13772_7115_4653_949d_ae1ef126bfd9.slice/crio-b61b91e341ad00a369345ce39abee213e1577c20f3537f466b6bb676abd61c5f WatchSource:0}: Error finding container b61b91e341ad00a369345ce39abee213e1577c20f3537f466b6bb676abd61c5f: Status 404 returned error can't find the container with id b61b91e341ad00a369345ce39abee213e1577c20f3537f466b6bb676abd61c5f Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.816394 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zmn7b-config-qk62c" event={"ID":"f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd","Type":"ContainerDied","Data":"011537470eacfbb2fd7e7827c07bb7d72fb754c7dddc80f7f520cd6157167bd2"} Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.816725 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="011537470eacfbb2fd7e7827c07bb7d72fb754c7dddc80f7f520cd6157167bd2" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 
10:32:40.816778 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zmn7b-config-qk62c" Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.820282 4735 generic.go:334] "Generic (PLEG): container finished" podID="52c13772-7115-4653-949d-ae1ef126bfd9" containerID="bac73fe606dbd822a876584978dd75f4b2c539a1dc1635d3cd614c2a4c92a23f" exitCode=0 Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.820476 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-73ae-account-create-n72rm" event={"ID":"52c13772-7115-4653-949d-ae1ef126bfd9","Type":"ContainerDied","Data":"bac73fe606dbd822a876584978dd75f4b2c539a1dc1635d3cd614c2a4c92a23f"} Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.820547 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-73ae-account-create-n72rm" event={"ID":"52c13772-7115-4653-949d-ae1ef126bfd9","Type":"ContainerStarted","Data":"b61b91e341ad00a369345ce39abee213e1577c20f3537f466b6bb676abd61c5f"} Oct 01 10:32:40 crc kubenswrapper[4735]: I1001 10:32:40.950228 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-szpsz"] Oct 01 10:32:40 crc kubenswrapper[4735]: W1001 10:32:40.954977 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2aebb4f3_5596_43e8_bbc4_dfb874bccd88.slice/crio-8c13d94dee78018b33cb18798ede432e41141bfa53203b9970af39ec2ae9d4ca WatchSource:0}: Error finding container 8c13d94dee78018b33cb18798ede432e41141bfa53203b9970af39ec2ae9d4ca: Status 404 returned error can't find the container with id 8c13d94dee78018b33cb18798ede432e41141bfa53203b9970af39ec2ae9d4ca Oct 01 10:32:41 crc kubenswrapper[4735]: I1001 10:32:41.235839 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-zmn7b-config-qk62c"] Oct 01 10:32:41 crc kubenswrapper[4735]: I1001 10:32:41.245216 4735 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-zmn7b-config-qk62c"] Oct 01 10:32:41 crc kubenswrapper[4735]: I1001 10:32:41.305482 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-zmn7b" Oct 01 10:32:41 crc kubenswrapper[4735]: I1001 10:32:41.828945 4735 generic.go:334] "Generic (PLEG): container finished" podID="2aebb4f3-5596-43e8-bbc4-dfb874bccd88" containerID="76e40cc5ebbe34ad51586bfe9895327200f67365acb2545737b55f9be6a8f934" exitCode=0 Oct 01 10:32:41 crc kubenswrapper[4735]: I1001 10:32:41.829071 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-szpsz" event={"ID":"2aebb4f3-5596-43e8-bbc4-dfb874bccd88","Type":"ContainerDied","Data":"76e40cc5ebbe34ad51586bfe9895327200f67365acb2545737b55f9be6a8f934"} Oct 01 10:32:41 crc kubenswrapper[4735]: I1001 10:32:41.829132 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-szpsz" event={"ID":"2aebb4f3-5596-43e8-bbc4-dfb874bccd88","Type":"ContainerStarted","Data":"8c13d94dee78018b33cb18798ede432e41141bfa53203b9970af39ec2ae9d4ca"} Oct 01 10:32:41 crc kubenswrapper[4735]: I1001 10:32:41.925392 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd" path="/var/lib/kubelet/pods/f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd/volumes" Oct 01 10:32:42 crc kubenswrapper[4735]: I1001 10:32:42.845997 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-szpsz" event={"ID":"2aebb4f3-5596-43e8-bbc4-dfb874bccd88","Type":"ContainerStarted","Data":"3c36bda10c0ef79e37afdadeff00655bd2951d835224a091e47cd6984653e325"} Oct 01 10:32:42 crc kubenswrapper[4735]: I1001 10:32:42.846786 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-szpsz" Oct 01 10:32:46 crc kubenswrapper[4735]: I1001 10:32:46.490096 4735 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 01 10:32:46 crc kubenswrapper[4735]: I1001 10:32:46.524074 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77585f5f8c-szpsz" podStartSLOduration=6.52405457 podStartE2EDuration="6.52405457s" podCreationTimestamp="2025-10-01 10:32:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:32:42.867540701 +0000 UTC m=+921.560362003" watchObservedRunningTime="2025-10-01 10:32:46.52405457 +0000 UTC m=+925.216875852" Oct 01 10:32:46 crc kubenswrapper[4735]: I1001 10:32:46.720696 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:32:46 crc kubenswrapper[4735]: I1001 10:32:46.812619 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-rfhkm"] Oct 01 10:32:46 crc kubenswrapper[4735]: E1001 10:32:46.812997 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd" containerName="ovn-config" Oct 01 10:32:46 crc kubenswrapper[4735]: I1001 10:32:46.813015 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd" containerName="ovn-config" Oct 01 10:32:46 crc kubenswrapper[4735]: I1001 10:32:46.813180 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8fbdb0c-cb25-417b-9f16-c7e09fb3fbfd" containerName="ovn-config" Oct 01 10:32:46 crc kubenswrapper[4735]: I1001 10:32:46.813715 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-rfhkm" Oct 01 10:32:46 crc kubenswrapper[4735]: I1001 10:32:46.823808 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-rfhkm"] Oct 01 10:32:46 crc kubenswrapper[4735]: I1001 10:32:46.901586 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-qczl5"] Oct 01 10:32:46 crc kubenswrapper[4735]: I1001 10:32:46.902841 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qczl5" Oct 01 10:32:46 crc kubenswrapper[4735]: I1001 10:32:46.913337 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-qczl5"] Oct 01 10:32:46 crc kubenswrapper[4735]: I1001 10:32:46.995791 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gvv2\" (UniqueName: \"kubernetes.io/projected/27dc947b-08ef-428b-aa2c-9a59d002b8c3-kube-api-access-7gvv2\") pod \"barbican-db-create-qczl5\" (UID: \"27dc947b-08ef-428b-aa2c-9a59d002b8c3\") " pod="openstack/barbican-db-create-qczl5" Oct 01 10:32:46 crc kubenswrapper[4735]: I1001 10:32:46.995830 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tjtj\" (UniqueName: \"kubernetes.io/projected/f855bd65-1b1b-40e5-98b1-8772d7cb3c8b-kube-api-access-6tjtj\") pod \"cinder-db-create-rfhkm\" (UID: \"f855bd65-1b1b-40e5-98b1-8772d7cb3c8b\") " pod="openstack/cinder-db-create-rfhkm" Oct 01 10:32:47 crc kubenswrapper[4735]: I1001 10:32:47.098167 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gvv2\" (UniqueName: \"kubernetes.io/projected/27dc947b-08ef-428b-aa2c-9a59d002b8c3-kube-api-access-7gvv2\") pod \"barbican-db-create-qczl5\" (UID: \"27dc947b-08ef-428b-aa2c-9a59d002b8c3\") " pod="openstack/barbican-db-create-qczl5" Oct 01 10:32:47 crc kubenswrapper[4735]: I1001 
10:32:47.098225 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tjtj\" (UniqueName: \"kubernetes.io/projected/f855bd65-1b1b-40e5-98b1-8772d7cb3c8b-kube-api-access-6tjtj\") pod \"cinder-db-create-rfhkm\" (UID: \"f855bd65-1b1b-40e5-98b1-8772d7cb3c8b\") " pod="openstack/cinder-db-create-rfhkm" Oct 01 10:32:47 crc kubenswrapper[4735]: I1001 10:32:47.113962 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-59bzl"] Oct 01 10:32:47 crc kubenswrapper[4735]: I1001 10:32:47.114948 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-59bzl" Oct 01 10:32:47 crc kubenswrapper[4735]: I1001 10:32:47.123593 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gvv2\" (UniqueName: \"kubernetes.io/projected/27dc947b-08ef-428b-aa2c-9a59d002b8c3-kube-api-access-7gvv2\") pod \"barbican-db-create-qczl5\" (UID: \"27dc947b-08ef-428b-aa2c-9a59d002b8c3\") " pod="openstack/barbican-db-create-qczl5" Oct 01 10:32:47 crc kubenswrapper[4735]: I1001 10:32:47.127296 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tjtj\" (UniqueName: \"kubernetes.io/projected/f855bd65-1b1b-40e5-98b1-8772d7cb3c8b-kube-api-access-6tjtj\") pod \"cinder-db-create-rfhkm\" (UID: \"f855bd65-1b1b-40e5-98b1-8772d7cb3c8b\") " pod="openstack/cinder-db-create-rfhkm" Oct 01 10:32:47 crc kubenswrapper[4735]: I1001 10:32:47.132811 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-59bzl"] Oct 01 10:32:47 crc kubenswrapper[4735]: I1001 10:32:47.140058 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-rfhkm" Oct 01 10:32:47 crc kubenswrapper[4735]: I1001 10:32:47.199782 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-488g7\" (UniqueName: \"kubernetes.io/projected/3d1cf88a-d884-4bdb-bb43-cc0a8f84da82-kube-api-access-488g7\") pod \"neutron-db-create-59bzl\" (UID: \"3d1cf88a-d884-4bdb-bb43-cc0a8f84da82\") " pod="openstack/neutron-db-create-59bzl" Oct 01 10:32:47 crc kubenswrapper[4735]: I1001 10:32:47.224546 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qczl5" Oct 01 10:32:47 crc kubenswrapper[4735]: I1001 10:32:47.300950 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-488g7\" (UniqueName: \"kubernetes.io/projected/3d1cf88a-d884-4bdb-bb43-cc0a8f84da82-kube-api-access-488g7\") pod \"neutron-db-create-59bzl\" (UID: \"3d1cf88a-d884-4bdb-bb43-cc0a8f84da82\") " pod="openstack/neutron-db-create-59bzl" Oct 01 10:32:47 crc kubenswrapper[4735]: I1001 10:32:47.320857 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-488g7\" (UniqueName: \"kubernetes.io/projected/3d1cf88a-d884-4bdb-bb43-cc0a8f84da82-kube-api-access-488g7\") pod \"neutron-db-create-59bzl\" (UID: \"3d1cf88a-d884-4bdb-bb43-cc0a8f84da82\") " pod="openstack/neutron-db-create-59bzl" Oct 01 10:32:47 crc kubenswrapper[4735]: I1001 10:32:47.496670 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-59bzl" Oct 01 10:32:48 crc kubenswrapper[4735]: I1001 10:32:48.393000 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-73ae-account-create-n72rm" Oct 01 10:32:48 crc kubenswrapper[4735]: I1001 10:32:48.517287 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4jb6\" (UniqueName: \"kubernetes.io/projected/52c13772-7115-4653-949d-ae1ef126bfd9-kube-api-access-g4jb6\") pod \"52c13772-7115-4653-949d-ae1ef126bfd9\" (UID: \"52c13772-7115-4653-949d-ae1ef126bfd9\") " Oct 01 10:32:48 crc kubenswrapper[4735]: I1001 10:32:48.524618 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52c13772-7115-4653-949d-ae1ef126bfd9-kube-api-access-g4jb6" (OuterVolumeSpecName: "kube-api-access-g4jb6") pod "52c13772-7115-4653-949d-ae1ef126bfd9" (UID: "52c13772-7115-4653-949d-ae1ef126bfd9"). InnerVolumeSpecName "kube-api-access-g4jb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:32:48 crc kubenswrapper[4735]: I1001 10:32:48.619315 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4jb6\" (UniqueName: \"kubernetes.io/projected/52c13772-7115-4653-949d-ae1ef126bfd9-kube-api-access-g4jb6\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:48 crc kubenswrapper[4735]: I1001 10:32:48.843006 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-rfhkm"] Oct 01 10:32:48 crc kubenswrapper[4735]: W1001 10:32:48.849970 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf855bd65_1b1b_40e5_98b1_8772d7cb3c8b.slice/crio-1154da96d29f108100a93a7b9d7b4aec56227eaeb1f7304b9c99b5a6a353d81b WatchSource:0}: Error finding container 1154da96d29f108100a93a7b9d7b4aec56227eaeb1f7304b9c99b5a6a353d81b: Status 404 returned error can't find the container with id 1154da96d29f108100a93a7b9d7b4aec56227eaeb1f7304b9c99b5a6a353d81b Oct 01 10:32:48 crc kubenswrapper[4735]: I1001 10:32:48.851568 4735 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/barbican-db-create-qczl5"] Oct 01 10:32:48 crc kubenswrapper[4735]: W1001 10:32:48.852150 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27dc947b_08ef_428b_aa2c_9a59d002b8c3.slice/crio-740fd92dc5f626df98af9759f468e26862b0f2e59ab2f10dbb42aacf41f51830 WatchSource:0}: Error finding container 740fd92dc5f626df98af9759f468e26862b0f2e59ab2f10dbb42aacf41f51830: Status 404 returned error can't find the container with id 740fd92dc5f626df98af9759f468e26862b0f2e59ab2f10dbb42aacf41f51830 Oct 01 10:32:48 crc kubenswrapper[4735]: I1001 10:32:48.903616 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-73ae-account-create-n72rm" Oct 01 10:32:48 crc kubenswrapper[4735]: I1001 10:32:48.903622 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-73ae-account-create-n72rm" event={"ID":"52c13772-7115-4653-949d-ae1ef126bfd9","Type":"ContainerDied","Data":"b61b91e341ad00a369345ce39abee213e1577c20f3537f466b6bb676abd61c5f"} Oct 01 10:32:48 crc kubenswrapper[4735]: I1001 10:32:48.903701 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b61b91e341ad00a369345ce39abee213e1577c20f3537f466b6bb676abd61c5f" Oct 01 10:32:48 crc kubenswrapper[4735]: I1001 10:32:48.906763 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-qczl5" event={"ID":"27dc947b-08ef-428b-aa2c-9a59d002b8c3","Type":"ContainerStarted","Data":"740fd92dc5f626df98af9759f468e26862b0f2e59ab2f10dbb42aacf41f51830"} Oct 01 10:32:48 crc kubenswrapper[4735]: I1001 10:32:48.907565 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-rfhkm" event={"ID":"f855bd65-1b1b-40e5-98b1-8772d7cb3c8b","Type":"ContainerStarted","Data":"1154da96d29f108100a93a7b9d7b4aec56227eaeb1f7304b9c99b5a6a353d81b"} Oct 01 10:32:48 crc kubenswrapper[4735]: I1001 
10:32:48.939950 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-59bzl"] Oct 01 10:32:49 crc kubenswrapper[4735]: I1001 10:32:49.917601 4735 generic.go:334] "Generic (PLEG): container finished" podID="27dc947b-08ef-428b-aa2c-9a59d002b8c3" containerID="b2a34c63e12d94aa2b41403d524670e670a6eca614006600e38b5d5cc5c464c9" exitCode=0 Oct 01 10:32:49 crc kubenswrapper[4735]: I1001 10:32:49.917695 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-qczl5" event={"ID":"27dc947b-08ef-428b-aa2c-9a59d002b8c3","Type":"ContainerDied","Data":"b2a34c63e12d94aa2b41403d524670e670a6eca614006600e38b5d5cc5c464c9"} Oct 01 10:32:49 crc kubenswrapper[4735]: I1001 10:32:49.920721 4735 generic.go:334] "Generic (PLEG): container finished" podID="3d1cf88a-d884-4bdb-bb43-cc0a8f84da82" containerID="03b034a160c6e1a708e4ca2d12ea746acd6a4333778b399011ed858edc88fed5" exitCode=0 Oct 01 10:32:49 crc kubenswrapper[4735]: I1001 10:32:49.920823 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-59bzl" event={"ID":"3d1cf88a-d884-4bdb-bb43-cc0a8f84da82","Type":"ContainerDied","Data":"03b034a160c6e1a708e4ca2d12ea746acd6a4333778b399011ed858edc88fed5"} Oct 01 10:32:49 crc kubenswrapper[4735]: I1001 10:32:49.920861 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-59bzl" event={"ID":"3d1cf88a-d884-4bdb-bb43-cc0a8f84da82","Type":"ContainerStarted","Data":"21f933f0a2f55fd9119f9c67515ef71dfcf71273ffb309a27720963dd2f40894"} Oct 01 10:32:49 crc kubenswrapper[4735]: I1001 10:32:49.922880 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qksn5" event={"ID":"eba15328-be82-4e04-a4bf-8097322615de","Type":"ContainerStarted","Data":"26d0fd0f6eefd8bf55f006dee6bc13552f42c805ed7016a1dd04c3e7b62ab7a0"} Oct 01 10:32:49 crc kubenswrapper[4735]: I1001 10:32:49.924467 4735 generic.go:334] "Generic (PLEG): container finished" 
podID="f855bd65-1b1b-40e5-98b1-8772d7cb3c8b" containerID="0e00181e8119afb1f18c8b915d6795d31a721f53898fbdbef34cd4833c6e1055" exitCode=0 Oct 01 10:32:49 crc kubenswrapper[4735]: I1001 10:32:49.924522 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-rfhkm" event={"ID":"f855bd65-1b1b-40e5-98b1-8772d7cb3c8b","Type":"ContainerDied","Data":"0e00181e8119afb1f18c8b915d6795d31a721f53898fbdbef34cd4833c6e1055"} Oct 01 10:32:50 crc kubenswrapper[4735]: I1001 10:32:50.002471 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-qksn5" podStartSLOduration=2.952564774 podStartE2EDuration="15.002436676s" podCreationTimestamp="2025-10-01 10:32:35 +0000 UTC" firstStartedPulling="2025-10-01 10:32:36.406738521 +0000 UTC m=+915.099559783" lastFinishedPulling="2025-10-01 10:32:48.456610433 +0000 UTC m=+927.149431685" observedRunningTime="2025-10-01 10:32:49.988932725 +0000 UTC m=+928.681753997" watchObservedRunningTime="2025-10-01 10:32:50.002436676 +0000 UTC m=+928.695257948" Oct 01 10:32:50 crc kubenswrapper[4735]: I1001 10:32:50.305607 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-mkgqx"] Oct 01 10:32:50 crc kubenswrapper[4735]: E1001 10:32:50.306277 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c13772-7115-4653-949d-ae1ef126bfd9" containerName="mariadb-account-create" Oct 01 10:32:50 crc kubenswrapper[4735]: I1001 10:32:50.306293 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c13772-7115-4653-949d-ae1ef126bfd9" containerName="mariadb-account-create" Oct 01 10:32:50 crc kubenswrapper[4735]: I1001 10:32:50.306465 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="52c13772-7115-4653-949d-ae1ef126bfd9" containerName="mariadb-account-create" Oct 01 10:32:50 crc kubenswrapper[4735]: I1001 10:32:50.307011 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-mkgqx" Oct 01 10:32:50 crc kubenswrapper[4735]: I1001 10:32:50.309631 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 01 10:32:50 crc kubenswrapper[4735]: I1001 10:32:50.309962 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 01 10:32:50 crc kubenswrapper[4735]: I1001 10:32:50.310165 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pn2sw" Oct 01 10:32:50 crc kubenswrapper[4735]: I1001 10:32:50.311845 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 01 10:32:50 crc kubenswrapper[4735]: I1001 10:32:50.331742 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-mkgqx"] Oct 01 10:32:50 crc kubenswrapper[4735]: I1001 10:32:50.462680 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a7ca219-5b36-4053-bc9c-92e73c7ad509-config-data\") pod \"keystone-db-sync-mkgqx\" (UID: \"9a7ca219-5b36-4053-bc9c-92e73c7ad509\") " pod="openstack/keystone-db-sync-mkgqx" Oct 01 10:32:50 crc kubenswrapper[4735]: I1001 10:32:50.462793 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcgng\" (UniqueName: \"kubernetes.io/projected/9a7ca219-5b36-4053-bc9c-92e73c7ad509-kube-api-access-mcgng\") pod \"keystone-db-sync-mkgqx\" (UID: \"9a7ca219-5b36-4053-bc9c-92e73c7ad509\") " pod="openstack/keystone-db-sync-mkgqx" Oct 01 10:32:50 crc kubenswrapper[4735]: I1001 10:32:50.463006 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a7ca219-5b36-4053-bc9c-92e73c7ad509-combined-ca-bundle\") pod \"keystone-db-sync-mkgqx\" (UID: 
\"9a7ca219-5b36-4053-bc9c-92e73c7ad509\") " pod="openstack/keystone-db-sync-mkgqx" Oct 01 10:32:50 crc kubenswrapper[4735]: I1001 10:32:50.520793 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77585f5f8c-szpsz" Oct 01 10:32:50 crc kubenswrapper[4735]: I1001 10:32:50.564123 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcgng\" (UniqueName: \"kubernetes.io/projected/9a7ca219-5b36-4053-bc9c-92e73c7ad509-kube-api-access-mcgng\") pod \"keystone-db-sync-mkgqx\" (UID: \"9a7ca219-5b36-4053-bc9c-92e73c7ad509\") " pod="openstack/keystone-db-sync-mkgqx" Oct 01 10:32:50 crc kubenswrapper[4735]: I1001 10:32:50.564219 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a7ca219-5b36-4053-bc9c-92e73c7ad509-combined-ca-bundle\") pod \"keystone-db-sync-mkgqx\" (UID: \"9a7ca219-5b36-4053-bc9c-92e73c7ad509\") " pod="openstack/keystone-db-sync-mkgqx" Oct 01 10:32:50 crc kubenswrapper[4735]: I1001 10:32:50.564305 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a7ca219-5b36-4053-bc9c-92e73c7ad509-config-data\") pod \"keystone-db-sync-mkgqx\" (UID: \"9a7ca219-5b36-4053-bc9c-92e73c7ad509\") " pod="openstack/keystone-db-sync-mkgqx" Oct 01 10:32:50 crc kubenswrapper[4735]: I1001 10:32:50.571942 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a7ca219-5b36-4053-bc9c-92e73c7ad509-config-data\") pod \"keystone-db-sync-mkgqx\" (UID: \"9a7ca219-5b36-4053-bc9c-92e73c7ad509\") " pod="openstack/keystone-db-sync-mkgqx" Oct 01 10:32:50 crc kubenswrapper[4735]: I1001 10:32:50.572943 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9a7ca219-5b36-4053-bc9c-92e73c7ad509-combined-ca-bundle\") pod \"keystone-db-sync-mkgqx\" (UID: \"9a7ca219-5b36-4053-bc9c-92e73c7ad509\") " pod="openstack/keystone-db-sync-mkgqx" Oct 01 10:32:50 crc kubenswrapper[4735]: I1001 10:32:50.582228 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcgng\" (UniqueName: \"kubernetes.io/projected/9a7ca219-5b36-4053-bc9c-92e73c7ad509-kube-api-access-mcgng\") pod \"keystone-db-sync-mkgqx\" (UID: \"9a7ca219-5b36-4053-bc9c-92e73c7ad509\") " pod="openstack/keystone-db-sync-mkgqx" Oct 01 10:32:50 crc kubenswrapper[4735]: I1001 10:32:50.622195 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-mf77b"] Oct 01 10:32:50 crc kubenswrapper[4735]: I1001 10:32:50.622459 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-mf77b" podUID="f9f055a8-fec6-40c5-bc84-b1455265886a" containerName="dnsmasq-dns" containerID="cri-o://8b6f27691cdc4fedafcdfe885dda34d3cf783172d1ee8c703ef17fe95bcdf0dc" gracePeriod=10 Oct 01 10:32:50 crc kubenswrapper[4735]: I1001 10:32:50.628993 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-mkgqx" Oct 01 10:32:50 crc kubenswrapper[4735]: I1001 10:32:50.939120 4735 generic.go:334] "Generic (PLEG): container finished" podID="f9f055a8-fec6-40c5-bc84-b1455265886a" containerID="8b6f27691cdc4fedafcdfe885dda34d3cf783172d1ee8c703ef17fe95bcdf0dc" exitCode=0 Oct 01 10:32:50 crc kubenswrapper[4735]: I1001 10:32:50.939640 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-mf77b" event={"ID":"f9f055a8-fec6-40c5-bc84-b1455265886a","Type":"ContainerDied","Data":"8b6f27691cdc4fedafcdfe885dda34d3cf783172d1ee8c703ef17fe95bcdf0dc"} Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.020359 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-mf77b" Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.083177 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-mkgqx"] Oct 01 10:32:51 crc kubenswrapper[4735]: W1001 10:32:51.083614 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a7ca219_5b36_4053_bc9c_92e73c7ad509.slice/crio-0ba696a8b7b73c101f0527f3b22f90c7d8aaac167c00285ab408ed100070d28c WatchSource:0}: Error finding container 0ba696a8b7b73c101f0527f3b22f90c7d8aaac167c00285ab408ed100070d28c: Status 404 returned error can't find the container with id 0ba696a8b7b73c101f0527f3b22f90c7d8aaac167c00285ab408ed100070d28c Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.177733 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9f055a8-fec6-40c5-bc84-b1455265886a-config\") pod \"f9f055a8-fec6-40c5-bc84-b1455265886a\" (UID: \"f9f055a8-fec6-40c5-bc84-b1455265886a\") " Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.177815 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9f055a8-fec6-40c5-bc84-b1455265886a-ovsdbserver-nb\") pod \"f9f055a8-fec6-40c5-bc84-b1455265886a\" (UID: \"f9f055a8-fec6-40c5-bc84-b1455265886a\") " Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.177854 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9f055a8-fec6-40c5-bc84-b1455265886a-dns-svc\") pod \"f9f055a8-fec6-40c5-bc84-b1455265886a\" (UID: \"f9f055a8-fec6-40c5-bc84-b1455265886a\") " Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.177880 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/f9f055a8-fec6-40c5-bc84-b1455265886a-ovsdbserver-sb\") pod \"f9f055a8-fec6-40c5-bc84-b1455265886a\" (UID: \"f9f055a8-fec6-40c5-bc84-b1455265886a\") " Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.177954 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94rlx\" (UniqueName: \"kubernetes.io/projected/f9f055a8-fec6-40c5-bc84-b1455265886a-kube-api-access-94rlx\") pod \"f9f055a8-fec6-40c5-bc84-b1455265886a\" (UID: \"f9f055a8-fec6-40c5-bc84-b1455265886a\") " Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.186347 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9f055a8-fec6-40c5-bc84-b1455265886a-kube-api-access-94rlx" (OuterVolumeSpecName: "kube-api-access-94rlx") pod "f9f055a8-fec6-40c5-bc84-b1455265886a" (UID: "f9f055a8-fec6-40c5-bc84-b1455265886a"). InnerVolumeSpecName "kube-api-access-94rlx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.202551 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-rfhkm" Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.243902 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9f055a8-fec6-40c5-bc84-b1455265886a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f9f055a8-fec6-40c5-bc84-b1455265886a" (UID: "f9f055a8-fec6-40c5-bc84-b1455265886a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.251342 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9f055a8-fec6-40c5-bc84-b1455265886a-config" (OuterVolumeSpecName: "config") pod "f9f055a8-fec6-40c5-bc84-b1455265886a" (UID: "f9f055a8-fec6-40c5-bc84-b1455265886a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.255780 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9f055a8-fec6-40c5-bc84-b1455265886a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f9f055a8-fec6-40c5-bc84-b1455265886a" (UID: "f9f055a8-fec6-40c5-bc84-b1455265886a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.259094 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9f055a8-fec6-40c5-bc84-b1455265886a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f9f055a8-fec6-40c5-bc84-b1455265886a" (UID: "f9f055a8-fec6-40c5-bc84-b1455265886a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.279415 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94rlx\" (UniqueName: \"kubernetes.io/projected/f9f055a8-fec6-40c5-bc84-b1455265886a-kube-api-access-94rlx\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.279455 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9f055a8-fec6-40c5-bc84-b1455265886a-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.279470 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9f055a8-fec6-40c5-bc84-b1455265886a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.279479 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9f055a8-fec6-40c5-bc84-b1455265886a-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:51 crc 
kubenswrapper[4735]: I1001 10:32:51.279489 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9f055a8-fec6-40c5-bc84-b1455265886a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.291742 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-59bzl" Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.298825 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qczl5" Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.380191 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tjtj\" (UniqueName: \"kubernetes.io/projected/f855bd65-1b1b-40e5-98b1-8772d7cb3c8b-kube-api-access-6tjtj\") pod \"f855bd65-1b1b-40e5-98b1-8772d7cb3c8b\" (UID: \"f855bd65-1b1b-40e5-98b1-8772d7cb3c8b\") " Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.383259 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f855bd65-1b1b-40e5-98b1-8772d7cb3c8b-kube-api-access-6tjtj" (OuterVolumeSpecName: "kube-api-access-6tjtj") pod "f855bd65-1b1b-40e5-98b1-8772d7cb3c8b" (UID: "f855bd65-1b1b-40e5-98b1-8772d7cb3c8b"). InnerVolumeSpecName "kube-api-access-6tjtj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.481642 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-488g7\" (UniqueName: \"kubernetes.io/projected/3d1cf88a-d884-4bdb-bb43-cc0a8f84da82-kube-api-access-488g7\") pod \"3d1cf88a-d884-4bdb-bb43-cc0a8f84da82\" (UID: \"3d1cf88a-d884-4bdb-bb43-cc0a8f84da82\") " Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.481862 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gvv2\" (UniqueName: \"kubernetes.io/projected/27dc947b-08ef-428b-aa2c-9a59d002b8c3-kube-api-access-7gvv2\") pod \"27dc947b-08ef-428b-aa2c-9a59d002b8c3\" (UID: \"27dc947b-08ef-428b-aa2c-9a59d002b8c3\") " Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.482178 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tjtj\" (UniqueName: \"kubernetes.io/projected/f855bd65-1b1b-40e5-98b1-8772d7cb3c8b-kube-api-access-6tjtj\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.485150 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d1cf88a-d884-4bdb-bb43-cc0a8f84da82-kube-api-access-488g7" (OuterVolumeSpecName: "kube-api-access-488g7") pod "3d1cf88a-d884-4bdb-bb43-cc0a8f84da82" (UID: "3d1cf88a-d884-4bdb-bb43-cc0a8f84da82"). InnerVolumeSpecName "kube-api-access-488g7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.485859 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27dc947b-08ef-428b-aa2c-9a59d002b8c3-kube-api-access-7gvv2" (OuterVolumeSpecName: "kube-api-access-7gvv2") pod "27dc947b-08ef-428b-aa2c-9a59d002b8c3" (UID: "27dc947b-08ef-428b-aa2c-9a59d002b8c3"). InnerVolumeSpecName "kube-api-access-7gvv2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.583915 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-488g7\" (UniqueName: \"kubernetes.io/projected/3d1cf88a-d884-4bdb-bb43-cc0a8f84da82-kube-api-access-488g7\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.583959 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gvv2\" (UniqueName: \"kubernetes.io/projected/27dc947b-08ef-428b-aa2c-9a59d002b8c3-kube-api-access-7gvv2\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.953910 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-59bzl" event={"ID":"3d1cf88a-d884-4bdb-bb43-cc0a8f84da82","Type":"ContainerDied","Data":"21f933f0a2f55fd9119f9c67515ef71dfcf71273ffb309a27720963dd2f40894"} Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.954435 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21f933f0a2f55fd9119f9c67515ef71dfcf71273ffb309a27720963dd2f40894" Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.954589 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-59bzl" Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.958628 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-rfhkm" Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.958617 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-rfhkm" event={"ID":"f855bd65-1b1b-40e5-98b1-8772d7cb3c8b","Type":"ContainerDied","Data":"1154da96d29f108100a93a7b9d7b4aec56227eaeb1f7304b9c99b5a6a353d81b"} Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.958857 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1154da96d29f108100a93a7b9d7b4aec56227eaeb1f7304b9c99b5a6a353d81b" Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.961339 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-mf77b" Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.961358 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-mf77b" event={"ID":"f9f055a8-fec6-40c5-bc84-b1455265886a","Type":"ContainerDied","Data":"31931eef4ab0d0b727c15bddbdfee3c3bfe9a29e3deec7c902948549bff22a54"} Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.961581 4735 scope.go:117] "RemoveContainer" containerID="8b6f27691cdc4fedafcdfe885dda34d3cf783172d1ee8c703ef17fe95bcdf0dc" Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.963051 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mkgqx" event={"ID":"9a7ca219-5b36-4053-bc9c-92e73c7ad509","Type":"ContainerStarted","Data":"0ba696a8b7b73c101f0527f3b22f90c7d8aaac167c00285ab408ed100070d28c"} Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.965718 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-qczl5" event={"ID":"27dc947b-08ef-428b-aa2c-9a59d002b8c3","Type":"ContainerDied","Data":"740fd92dc5f626df98af9759f468e26862b0f2e59ab2f10dbb42aacf41f51830"} Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.965756 4735 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="740fd92dc5f626df98af9759f468e26862b0f2e59ab2f10dbb42aacf41f51830" Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.966130 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qczl5" Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.996915 4735 scope.go:117] "RemoveContainer" containerID="866068884d9e51446ca7596b461d363f32ee8e78d6e77b0247ee3bf1a32d52ee" Oct 01 10:32:51 crc kubenswrapper[4735]: I1001 10:32:51.998248 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-mf77b"] Oct 01 10:32:52 crc kubenswrapper[4735]: I1001 10:32:52.007976 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-mf77b"] Oct 01 10:32:53 crc kubenswrapper[4735]: I1001 10:32:53.907425 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9f055a8-fec6-40c5-bc84-b1455265886a" path="/var/lib/kubelet/pods/f9f055a8-fec6-40c5-bc84-b1455265886a/volumes" Oct 01 10:32:56 crc kubenswrapper[4735]: I1001 10:32:56.007121 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mkgqx" event={"ID":"9a7ca219-5b36-4053-bc9c-92e73c7ad509","Type":"ContainerStarted","Data":"a2be7bfff3648b55a6637e07458d8410ce6fda444ff880112fe2736e341480d5"} Oct 01 10:32:56 crc kubenswrapper[4735]: I1001 10:32:56.029603 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-mkgqx" podStartSLOduration=1.932139252 podStartE2EDuration="6.029585551s" podCreationTimestamp="2025-10-01 10:32:50 +0000 UTC" firstStartedPulling="2025-10-01 10:32:51.085729261 +0000 UTC m=+929.778550523" lastFinishedPulling="2025-10-01 10:32:55.18317552 +0000 UTC m=+933.875996822" observedRunningTime="2025-10-01 10:32:56.022705617 +0000 UTC m=+934.715526879" watchObservedRunningTime="2025-10-01 10:32:56.029585551 +0000 UTC m=+934.722406823" Oct 01 10:32:57 crc 
kubenswrapper[4735]: I1001 10:32:57.017935 4735 generic.go:334] "Generic (PLEG): container finished" podID="eba15328-be82-4e04-a4bf-8097322615de" containerID="26d0fd0f6eefd8bf55f006dee6bc13552f42c805ed7016a1dd04c3e7b62ab7a0" exitCode=0 Oct 01 10:32:57 crc kubenswrapper[4735]: I1001 10:32:57.018055 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qksn5" event={"ID":"eba15328-be82-4e04-a4bf-8097322615de","Type":"ContainerDied","Data":"26d0fd0f6eefd8bf55f006dee6bc13552f42c805ed7016a1dd04c3e7b62ab7a0"} Oct 01 10:32:58 crc kubenswrapper[4735]: I1001 10:32:58.409525 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-qksn5" Oct 01 10:32:58 crc kubenswrapper[4735]: I1001 10:32:58.606165 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxskl\" (UniqueName: \"kubernetes.io/projected/eba15328-be82-4e04-a4bf-8097322615de-kube-api-access-rxskl\") pod \"eba15328-be82-4e04-a4bf-8097322615de\" (UID: \"eba15328-be82-4e04-a4bf-8097322615de\") " Oct 01 10:32:58 crc kubenswrapper[4735]: I1001 10:32:58.606261 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/eba15328-be82-4e04-a4bf-8097322615de-db-sync-config-data\") pod \"eba15328-be82-4e04-a4bf-8097322615de\" (UID: \"eba15328-be82-4e04-a4bf-8097322615de\") " Oct 01 10:32:58 crc kubenswrapper[4735]: I1001 10:32:58.606286 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eba15328-be82-4e04-a4bf-8097322615de-config-data\") pod \"eba15328-be82-4e04-a4bf-8097322615de\" (UID: \"eba15328-be82-4e04-a4bf-8097322615de\") " Oct 01 10:32:58 crc kubenswrapper[4735]: I1001 10:32:58.606371 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/eba15328-be82-4e04-a4bf-8097322615de-combined-ca-bundle\") pod \"eba15328-be82-4e04-a4bf-8097322615de\" (UID: \"eba15328-be82-4e04-a4bf-8097322615de\") " Oct 01 10:32:58 crc kubenswrapper[4735]: I1001 10:32:58.612066 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eba15328-be82-4e04-a4bf-8097322615de-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "eba15328-be82-4e04-a4bf-8097322615de" (UID: "eba15328-be82-4e04-a4bf-8097322615de"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:32:58 crc kubenswrapper[4735]: I1001 10:32:58.613244 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eba15328-be82-4e04-a4bf-8097322615de-kube-api-access-rxskl" (OuterVolumeSpecName: "kube-api-access-rxskl") pod "eba15328-be82-4e04-a4bf-8097322615de" (UID: "eba15328-be82-4e04-a4bf-8097322615de"). InnerVolumeSpecName "kube-api-access-rxskl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:32:58 crc kubenswrapper[4735]: I1001 10:32:58.639440 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eba15328-be82-4e04-a4bf-8097322615de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eba15328-be82-4e04-a4bf-8097322615de" (UID: "eba15328-be82-4e04-a4bf-8097322615de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:32:58 crc kubenswrapper[4735]: I1001 10:32:58.673439 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eba15328-be82-4e04-a4bf-8097322615de-config-data" (OuterVolumeSpecName: "config-data") pod "eba15328-be82-4e04-a4bf-8097322615de" (UID: "eba15328-be82-4e04-a4bf-8097322615de"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:32:58 crc kubenswrapper[4735]: I1001 10:32:58.708845 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxskl\" (UniqueName: \"kubernetes.io/projected/eba15328-be82-4e04-a4bf-8097322615de-kube-api-access-rxskl\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:58 crc kubenswrapper[4735]: I1001 10:32:58.709028 4735 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/eba15328-be82-4e04-a4bf-8097322615de-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:58 crc kubenswrapper[4735]: I1001 10:32:58.709370 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eba15328-be82-4e04-a4bf-8097322615de-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:58 crc kubenswrapper[4735]: I1001 10:32:58.709665 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eba15328-be82-4e04-a4bf-8097322615de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.040326 4735 generic.go:334] "Generic (PLEG): container finished" podID="9a7ca219-5b36-4053-bc9c-92e73c7ad509" containerID="a2be7bfff3648b55a6637e07458d8410ce6fda444ff880112fe2736e341480d5" exitCode=0 Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.040423 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mkgqx" event={"ID":"9a7ca219-5b36-4053-bc9c-92e73c7ad509","Type":"ContainerDied","Data":"a2be7bfff3648b55a6637e07458d8410ce6fda444ff880112fe2736e341480d5"} Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.043360 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qksn5" 
event={"ID":"eba15328-be82-4e04-a4bf-8097322615de","Type":"ContainerDied","Data":"a26e068a9dfbcecc6397bc92bcb601362e9bd8ceb6bed20bd04602032c13975f"} Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.043407 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a26e068a9dfbcecc6397bc92bcb601362e9bd8ceb6bed20bd04602032c13975f" Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.043461 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-qksn5" Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.430678 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-65bxv"] Oct 01 10:32:59 crc kubenswrapper[4735]: E1001 10:32:59.431264 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d1cf88a-d884-4bdb-bb43-cc0a8f84da82" containerName="mariadb-database-create" Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.431276 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d1cf88a-d884-4bdb-bb43-cc0a8f84da82" containerName="mariadb-database-create" Oct 01 10:32:59 crc kubenswrapper[4735]: E1001 10:32:59.431290 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eba15328-be82-4e04-a4bf-8097322615de" containerName="glance-db-sync" Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.431299 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="eba15328-be82-4e04-a4bf-8097322615de" containerName="glance-db-sync" Oct 01 10:32:59 crc kubenswrapper[4735]: E1001 10:32:59.431317 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9f055a8-fec6-40c5-bc84-b1455265886a" containerName="dnsmasq-dns" Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.431324 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9f055a8-fec6-40c5-bc84-b1455265886a" containerName="dnsmasq-dns" Oct 01 10:32:59 crc kubenswrapper[4735]: E1001 10:32:59.431332 4735 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="27dc947b-08ef-428b-aa2c-9a59d002b8c3" containerName="mariadb-database-create" Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.431337 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="27dc947b-08ef-428b-aa2c-9a59d002b8c3" containerName="mariadb-database-create" Oct 01 10:32:59 crc kubenswrapper[4735]: E1001 10:32:59.431353 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9f055a8-fec6-40c5-bc84-b1455265886a" containerName="init" Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.431361 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9f055a8-fec6-40c5-bc84-b1455265886a" containerName="init" Oct 01 10:32:59 crc kubenswrapper[4735]: E1001 10:32:59.431371 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f855bd65-1b1b-40e5-98b1-8772d7cb3c8b" containerName="mariadb-database-create" Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.431376 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f855bd65-1b1b-40e5-98b1-8772d7cb3c8b" containerName="mariadb-database-create" Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.431556 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="eba15328-be82-4e04-a4bf-8097322615de" containerName="glance-db-sync" Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.431573 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d1cf88a-d884-4bdb-bb43-cc0a8f84da82" containerName="mariadb-database-create" Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.431584 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9f055a8-fec6-40c5-bc84-b1455265886a" containerName="dnsmasq-dns" Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.431595 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="27dc947b-08ef-428b-aa2c-9a59d002b8c3" containerName="mariadb-database-create" Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.431606 
4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f855bd65-1b1b-40e5-98b1-8772d7cb3c8b" containerName="mariadb-database-create" Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.432421 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-65bxv" Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.442637 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-65bxv"] Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.520592 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2704624e-91f8-44f8-be80-d011132f7516-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-65bxv\" (UID: \"2704624e-91f8-44f8-be80-d011132f7516\") " pod="openstack/dnsmasq-dns-7ff5475cc9-65bxv" Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.520897 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2704624e-91f8-44f8-be80-d011132f7516-config\") pod \"dnsmasq-dns-7ff5475cc9-65bxv\" (UID: \"2704624e-91f8-44f8-be80-d011132f7516\") " pod="openstack/dnsmasq-dns-7ff5475cc9-65bxv" Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.520934 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbgpd\" (UniqueName: \"kubernetes.io/projected/2704624e-91f8-44f8-be80-d011132f7516-kube-api-access-nbgpd\") pod \"dnsmasq-dns-7ff5475cc9-65bxv\" (UID: \"2704624e-91f8-44f8-be80-d011132f7516\") " pod="openstack/dnsmasq-dns-7ff5475cc9-65bxv" Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.520960 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2704624e-91f8-44f8-be80-d011132f7516-dns-svc\") pod 
\"dnsmasq-dns-7ff5475cc9-65bxv\" (UID: \"2704624e-91f8-44f8-be80-d011132f7516\") " pod="openstack/dnsmasq-dns-7ff5475cc9-65bxv" Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.521132 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2704624e-91f8-44f8-be80-d011132f7516-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-65bxv\" (UID: \"2704624e-91f8-44f8-be80-d011132f7516\") " pod="openstack/dnsmasq-dns-7ff5475cc9-65bxv" Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.521354 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2704624e-91f8-44f8-be80-d011132f7516-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-65bxv\" (UID: \"2704624e-91f8-44f8-be80-d011132f7516\") " pod="openstack/dnsmasq-dns-7ff5475cc9-65bxv" Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.623215 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2704624e-91f8-44f8-be80-d011132f7516-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-65bxv\" (UID: \"2704624e-91f8-44f8-be80-d011132f7516\") " pod="openstack/dnsmasq-dns-7ff5475cc9-65bxv" Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.623289 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2704624e-91f8-44f8-be80-d011132f7516-config\") pod \"dnsmasq-dns-7ff5475cc9-65bxv\" (UID: \"2704624e-91f8-44f8-be80-d011132f7516\") " pod="openstack/dnsmasq-dns-7ff5475cc9-65bxv" Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.623317 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbgpd\" (UniqueName: \"kubernetes.io/projected/2704624e-91f8-44f8-be80-d011132f7516-kube-api-access-nbgpd\") pod 
\"dnsmasq-dns-7ff5475cc9-65bxv\" (UID: \"2704624e-91f8-44f8-be80-d011132f7516\") " pod="openstack/dnsmasq-dns-7ff5475cc9-65bxv" Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.623353 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2704624e-91f8-44f8-be80-d011132f7516-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-65bxv\" (UID: \"2704624e-91f8-44f8-be80-d011132f7516\") " pod="openstack/dnsmasq-dns-7ff5475cc9-65bxv" Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.623386 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2704624e-91f8-44f8-be80-d011132f7516-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-65bxv\" (UID: \"2704624e-91f8-44f8-be80-d011132f7516\") " pod="openstack/dnsmasq-dns-7ff5475cc9-65bxv" Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.623474 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2704624e-91f8-44f8-be80-d011132f7516-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-65bxv\" (UID: \"2704624e-91f8-44f8-be80-d011132f7516\") " pod="openstack/dnsmasq-dns-7ff5475cc9-65bxv" Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.624488 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2704624e-91f8-44f8-be80-d011132f7516-config\") pod \"dnsmasq-dns-7ff5475cc9-65bxv\" (UID: \"2704624e-91f8-44f8-be80-d011132f7516\") " pod="openstack/dnsmasq-dns-7ff5475cc9-65bxv" Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.624642 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2704624e-91f8-44f8-be80-d011132f7516-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-65bxv\" (UID: \"2704624e-91f8-44f8-be80-d011132f7516\") " 
pod="openstack/dnsmasq-dns-7ff5475cc9-65bxv" Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.624692 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2704624e-91f8-44f8-be80-d011132f7516-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-65bxv\" (UID: \"2704624e-91f8-44f8-be80-d011132f7516\") " pod="openstack/dnsmasq-dns-7ff5475cc9-65bxv" Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.624710 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2704624e-91f8-44f8-be80-d011132f7516-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-65bxv\" (UID: \"2704624e-91f8-44f8-be80-d011132f7516\") " pod="openstack/dnsmasq-dns-7ff5475cc9-65bxv" Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.624956 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2704624e-91f8-44f8-be80-d011132f7516-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-65bxv\" (UID: \"2704624e-91f8-44f8-be80-d011132f7516\") " pod="openstack/dnsmasq-dns-7ff5475cc9-65bxv" Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.647389 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbgpd\" (UniqueName: \"kubernetes.io/projected/2704624e-91f8-44f8-be80-d011132f7516-kube-api-access-nbgpd\") pod \"dnsmasq-dns-7ff5475cc9-65bxv\" (UID: \"2704624e-91f8-44f8-be80-d011132f7516\") " pod="openstack/dnsmasq-dns-7ff5475cc9-65bxv" Oct 01 10:32:59 crc kubenswrapper[4735]: I1001 10:32:59.749706 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-65bxv" Oct 01 10:33:00 crc kubenswrapper[4735]: I1001 10:33:00.201662 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-65bxv"] Oct 01 10:33:00 crc kubenswrapper[4735]: I1001 10:33:00.446724 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-mkgqx" Oct 01 10:33:00 crc kubenswrapper[4735]: I1001 10:33:00.638716 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a7ca219-5b36-4053-bc9c-92e73c7ad509-combined-ca-bundle\") pod \"9a7ca219-5b36-4053-bc9c-92e73c7ad509\" (UID: \"9a7ca219-5b36-4053-bc9c-92e73c7ad509\") " Oct 01 10:33:00 crc kubenswrapper[4735]: I1001 10:33:00.639049 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a7ca219-5b36-4053-bc9c-92e73c7ad509-config-data\") pod \"9a7ca219-5b36-4053-bc9c-92e73c7ad509\" (UID: \"9a7ca219-5b36-4053-bc9c-92e73c7ad509\") " Oct 01 10:33:00 crc kubenswrapper[4735]: I1001 10:33:00.639071 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcgng\" (UniqueName: \"kubernetes.io/projected/9a7ca219-5b36-4053-bc9c-92e73c7ad509-kube-api-access-mcgng\") pod \"9a7ca219-5b36-4053-bc9c-92e73c7ad509\" (UID: \"9a7ca219-5b36-4053-bc9c-92e73c7ad509\") " Oct 01 10:33:00 crc kubenswrapper[4735]: I1001 10:33:00.650735 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a7ca219-5b36-4053-bc9c-92e73c7ad509-kube-api-access-mcgng" (OuterVolumeSpecName: "kube-api-access-mcgng") pod "9a7ca219-5b36-4053-bc9c-92e73c7ad509" (UID: "9a7ca219-5b36-4053-bc9c-92e73c7ad509"). InnerVolumeSpecName "kube-api-access-mcgng". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:33:00 crc kubenswrapper[4735]: I1001 10:33:00.671688 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a7ca219-5b36-4053-bc9c-92e73c7ad509-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a7ca219-5b36-4053-bc9c-92e73c7ad509" (UID: "9a7ca219-5b36-4053-bc9c-92e73c7ad509"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:33:00 crc kubenswrapper[4735]: I1001 10:33:00.699551 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a7ca219-5b36-4053-bc9c-92e73c7ad509-config-data" (OuterVolumeSpecName: "config-data") pod "9a7ca219-5b36-4053-bc9c-92e73c7ad509" (UID: "9a7ca219-5b36-4053-bc9c-92e73c7ad509"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:33:00 crc kubenswrapper[4735]: I1001 10:33:00.740782 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a7ca219-5b36-4053-bc9c-92e73c7ad509-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:00 crc kubenswrapper[4735]: I1001 10:33:00.740825 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a7ca219-5b36-4053-bc9c-92e73c7ad509-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:00 crc kubenswrapper[4735]: I1001 10:33:00.740839 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcgng\" (UniqueName: \"kubernetes.io/projected/9a7ca219-5b36-4053-bc9c-92e73c7ad509-kube-api-access-mcgng\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.063909 4735 generic.go:334] "Generic (PLEG): container finished" podID="2704624e-91f8-44f8-be80-d011132f7516" containerID="1ea0b5a324e3e9613893b30072b0e33bc1444a96924598619087ac3edbd5dece" 
exitCode=0 Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.063976 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-65bxv" event={"ID":"2704624e-91f8-44f8-be80-d011132f7516","Type":"ContainerDied","Data":"1ea0b5a324e3e9613893b30072b0e33bc1444a96924598619087ac3edbd5dece"} Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.064007 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-65bxv" event={"ID":"2704624e-91f8-44f8-be80-d011132f7516","Type":"ContainerStarted","Data":"35b600f113eb5fb9819f29ca4e1a2e5672891334b5661742f3a7f7b5a277ed3e"} Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.068998 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mkgqx" event={"ID":"9a7ca219-5b36-4053-bc9c-92e73c7ad509","Type":"ContainerDied","Data":"0ba696a8b7b73c101f0527f3b22f90c7d8aaac167c00285ab408ed100070d28c"} Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.069037 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ba696a8b7b73c101f0527f3b22f90c7d8aaac167c00285ab408ed100070d28c" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.069043 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-mkgqx" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.316881 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-65bxv"] Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.360554 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-zbzm7"] Oct 01 10:33:01 crc kubenswrapper[4735]: E1001 10:33:01.360940 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a7ca219-5b36-4053-bc9c-92e73c7ad509" containerName="keystone-db-sync" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.360959 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a7ca219-5b36-4053-bc9c-92e73c7ad509" containerName="keystone-db-sync" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.361138 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a7ca219-5b36-4053-bc9c-92e73c7ad509" containerName="keystone-db-sync" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.361996 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-zbzm7" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.372940 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-tq997"] Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.374208 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-tq997" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.379478 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-zbzm7"] Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.379851 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pn2sw" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.380128 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.380213 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.380287 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.413728 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-tq997"] Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.453834 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe773cb9-e1ff-4413-872e-bb7ae002c86b-config-data\") pod \"keystone-bootstrap-tq997\" (UID: \"fe773cb9-e1ff-4413-872e-bb7ae002c86b\") " pod="openstack/keystone-bootstrap-tq997" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.453903 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe773cb9-e1ff-4413-872e-bb7ae002c86b-combined-ca-bundle\") pod \"keystone-bootstrap-tq997\" (UID: \"fe773cb9-e1ff-4413-872e-bb7ae002c86b\") " pod="openstack/keystone-bootstrap-tq997" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.453932 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe773cb9-e1ff-4413-872e-bb7ae002c86b-fernet-keys\") pod \"keystone-bootstrap-tq997\" (UID: \"fe773cb9-e1ff-4413-872e-bb7ae002c86b\") " pod="openstack/keystone-bootstrap-tq997" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.453952 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-config\") pod \"dnsmasq-dns-5c5cc7c5ff-zbzm7\" (UID: \"8bb0c796-be9f-424d-b9a0-48d6d6dd999b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-zbzm7" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.453979 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-zbzm7\" (UID: \"8bb0c796-be9f-424d-b9a0-48d6d6dd999b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-zbzm7" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.454026 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe773cb9-e1ff-4413-872e-bb7ae002c86b-scripts\") pod \"keystone-bootstrap-tq997\" (UID: \"fe773cb9-e1ff-4413-872e-bb7ae002c86b\") " pod="openstack/keystone-bootstrap-tq997" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.454050 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6xdp\" (UniqueName: \"kubernetes.io/projected/fe773cb9-e1ff-4413-872e-bb7ae002c86b-kube-api-access-l6xdp\") pod \"keystone-bootstrap-tq997\" (UID: \"fe773cb9-e1ff-4413-872e-bb7ae002c86b\") " pod="openstack/keystone-bootstrap-tq997" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.454072 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-zbzm7\" (UID: \"8bb0c796-be9f-424d-b9a0-48d6d6dd999b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-zbzm7" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.454098 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-zbzm7\" (UID: \"8bb0c796-be9f-424d-b9a0-48d6d6dd999b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-zbzm7" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.454202 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw7cg\" (UniqueName: \"kubernetes.io/projected/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-kube-api-access-cw7cg\") pod \"dnsmasq-dns-5c5cc7c5ff-zbzm7\" (UID: \"8bb0c796-be9f-424d-b9a0-48d6d6dd999b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-zbzm7" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.454279 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-zbzm7\" (UID: \"8bb0c796-be9f-424d-b9a0-48d6d6dd999b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-zbzm7" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.454325 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe773cb9-e1ff-4413-872e-bb7ae002c86b-credential-keys\") pod \"keystone-bootstrap-tq997\" (UID: \"fe773cb9-e1ff-4413-872e-bb7ae002c86b\") " pod="openstack/keystone-bootstrap-tq997" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.518935 4735 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/horizon-687c8c8cfc-p8gqg"] Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.526103 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-687c8c8cfc-p8gqg" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.530319 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.530583 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-bpfpm" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.533012 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.539008 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.543872 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-687c8c8cfc-p8gqg"] Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.555925 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe773cb9-e1ff-4413-872e-bb7ae002c86b-scripts\") pod \"keystone-bootstrap-tq997\" (UID: \"fe773cb9-e1ff-4413-872e-bb7ae002c86b\") " pod="openstack/keystone-bootstrap-tq997" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.555982 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6xdp\" (UniqueName: \"kubernetes.io/projected/fe773cb9-e1ff-4413-872e-bb7ae002c86b-kube-api-access-l6xdp\") pod \"keystone-bootstrap-tq997\" (UID: \"fe773cb9-e1ff-4413-872e-bb7ae002c86b\") " pod="openstack/keystone-bootstrap-tq997" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.556015 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/a580b903-ba8b-44cb-bff3-8ff737dce411-config-data\") pod \"horizon-687c8c8cfc-p8gqg\" (UID: \"a580b903-ba8b-44cb-bff3-8ff737dce411\") " pod="openstack/horizon-687c8c8cfc-p8gqg" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.556049 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-zbzm7\" (UID: \"8bb0c796-be9f-424d-b9a0-48d6d6dd999b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-zbzm7" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.556077 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-zbzm7\" (UID: \"8bb0c796-be9f-424d-b9a0-48d6d6dd999b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-zbzm7" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.556115 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw7cg\" (UniqueName: \"kubernetes.io/projected/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-kube-api-access-cw7cg\") pod \"dnsmasq-dns-5c5cc7c5ff-zbzm7\" (UID: \"8bb0c796-be9f-424d-b9a0-48d6d6dd999b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-zbzm7" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.556142 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a580b903-ba8b-44cb-bff3-8ff737dce411-scripts\") pod \"horizon-687c8c8cfc-p8gqg\" (UID: \"a580b903-ba8b-44cb-bff3-8ff737dce411\") " pod="openstack/horizon-687c8c8cfc-p8gqg" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.556183 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-zbzm7\" (UID: \"8bb0c796-be9f-424d-b9a0-48d6d6dd999b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-zbzm7" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.556211 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xt2n\" (UniqueName: \"kubernetes.io/projected/a580b903-ba8b-44cb-bff3-8ff737dce411-kube-api-access-2xt2n\") pod \"horizon-687c8c8cfc-p8gqg\" (UID: \"a580b903-ba8b-44cb-bff3-8ff737dce411\") " pod="openstack/horizon-687c8c8cfc-p8gqg" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.556243 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe773cb9-e1ff-4413-872e-bb7ae002c86b-credential-keys\") pod \"keystone-bootstrap-tq997\" (UID: \"fe773cb9-e1ff-4413-872e-bb7ae002c86b\") " pod="openstack/keystone-bootstrap-tq997" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.556284 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe773cb9-e1ff-4413-872e-bb7ae002c86b-config-data\") pod \"keystone-bootstrap-tq997\" (UID: \"fe773cb9-e1ff-4413-872e-bb7ae002c86b\") " pod="openstack/keystone-bootstrap-tq997" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.556327 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe773cb9-e1ff-4413-872e-bb7ae002c86b-combined-ca-bundle\") pod \"keystone-bootstrap-tq997\" (UID: \"fe773cb9-e1ff-4413-872e-bb7ae002c86b\") " pod="openstack/keystone-bootstrap-tq997" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.556362 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/fe773cb9-e1ff-4413-872e-bb7ae002c86b-fernet-keys\") pod \"keystone-bootstrap-tq997\" (UID: \"fe773cb9-e1ff-4413-872e-bb7ae002c86b\") " pod="openstack/keystone-bootstrap-tq997" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.556389 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-config\") pod \"dnsmasq-dns-5c5cc7c5ff-zbzm7\" (UID: \"8bb0c796-be9f-424d-b9a0-48d6d6dd999b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-zbzm7" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.556437 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-zbzm7\" (UID: \"8bb0c796-be9f-424d-b9a0-48d6d6dd999b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-zbzm7" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.556464 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a580b903-ba8b-44cb-bff3-8ff737dce411-horizon-secret-key\") pod \"horizon-687c8c8cfc-p8gqg\" (UID: \"a580b903-ba8b-44cb-bff3-8ff737dce411\") " pod="openstack/horizon-687c8c8cfc-p8gqg" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.556540 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a580b903-ba8b-44cb-bff3-8ff737dce411-logs\") pod \"horizon-687c8c8cfc-p8gqg\" (UID: \"a580b903-ba8b-44cb-bff3-8ff737dce411\") " pod="openstack/horizon-687c8c8cfc-p8gqg" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.557557 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-zbzm7\" (UID: \"8bb0c796-be9f-424d-b9a0-48d6d6dd999b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-zbzm7" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.558940 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-zbzm7\" (UID: \"8bb0c796-be9f-424d-b9a0-48d6d6dd999b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-zbzm7" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.560357 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-config\") pod \"dnsmasq-dns-5c5cc7c5ff-zbzm7\" (UID: \"8bb0c796-be9f-424d-b9a0-48d6d6dd999b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-zbzm7" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.561005 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-zbzm7\" (UID: \"8bb0c796-be9f-424d-b9a0-48d6d6dd999b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-zbzm7" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.563156 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-zbzm7\" (UID: \"8bb0c796-be9f-424d-b9a0-48d6d6dd999b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-zbzm7" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.570762 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe773cb9-e1ff-4413-872e-bb7ae002c86b-fernet-keys\") pod \"keystone-bootstrap-tq997\" (UID: 
\"fe773cb9-e1ff-4413-872e-bb7ae002c86b\") " pod="openstack/keystone-bootstrap-tq997" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.570908 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe773cb9-e1ff-4413-872e-bb7ae002c86b-credential-keys\") pod \"keystone-bootstrap-tq997\" (UID: \"fe773cb9-e1ff-4413-872e-bb7ae002c86b\") " pod="openstack/keystone-bootstrap-tq997" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.583659 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe773cb9-e1ff-4413-872e-bb7ae002c86b-combined-ca-bundle\") pod \"keystone-bootstrap-tq997\" (UID: \"fe773cb9-e1ff-4413-872e-bb7ae002c86b\") " pod="openstack/keystone-bootstrap-tq997" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.584115 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.584244 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe773cb9-e1ff-4413-872e-bb7ae002c86b-config-data\") pod \"keystone-bootstrap-tq997\" (UID: \"fe773cb9-e1ff-4413-872e-bb7ae002c86b\") " pod="openstack/keystone-bootstrap-tq997" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.586120 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.589311 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe773cb9-e1ff-4413-872e-bb7ae002c86b-scripts\") pod \"keystone-bootstrap-tq997\" (UID: \"fe773cb9-e1ff-4413-872e-bb7ae002c86b\") " pod="openstack/keystone-bootstrap-tq997" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.593488 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.593643 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.594033 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6xdp\" (UniqueName: \"kubernetes.io/projected/fe773cb9-e1ff-4413-872e-bb7ae002c86b-kube-api-access-l6xdp\") pod \"keystone-bootstrap-tq997\" (UID: \"fe773cb9-e1ff-4413-872e-bb7ae002c86b\") " pod="openstack/keystone-bootstrap-tq997" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.616320 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw7cg\" (UniqueName: \"kubernetes.io/projected/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-kube-api-access-cw7cg\") pod \"dnsmasq-dns-5c5cc7c5ff-zbzm7\" (UID: \"8bb0c796-be9f-424d-b9a0-48d6d6dd999b\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-zbzm7" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.616637 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.648387 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-t8d7j"] Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.649392 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-t8d7j" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.658413 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xt2n\" (UniqueName: \"kubernetes.io/projected/a580b903-ba8b-44cb-bff3-8ff737dce411-kube-api-access-2xt2n\") pod \"horizon-687c8c8cfc-p8gqg\" (UID: \"a580b903-ba8b-44cb-bff3-8ff737dce411\") " pod="openstack/horizon-687c8c8cfc-p8gqg" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.658553 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a580b903-ba8b-44cb-bff3-8ff737dce411-horizon-secret-key\") pod \"horizon-687c8c8cfc-p8gqg\" (UID: \"a580b903-ba8b-44cb-bff3-8ff737dce411\") " pod="openstack/horizon-687c8c8cfc-p8gqg" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.658609 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a580b903-ba8b-44cb-bff3-8ff737dce411-logs\") pod \"horizon-687c8c8cfc-p8gqg\" (UID: \"a580b903-ba8b-44cb-bff3-8ff737dce411\") " pod="openstack/horizon-687c8c8cfc-p8gqg" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.658662 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a580b903-ba8b-44cb-bff3-8ff737dce411-config-data\") pod \"horizon-687c8c8cfc-p8gqg\" (UID: \"a580b903-ba8b-44cb-bff3-8ff737dce411\") " pod="openstack/horizon-687c8c8cfc-p8gqg" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.658711 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a580b903-ba8b-44cb-bff3-8ff737dce411-scripts\") pod \"horizon-687c8c8cfc-p8gqg\" (UID: \"a580b903-ba8b-44cb-bff3-8ff737dce411\") " pod="openstack/horizon-687c8c8cfc-p8gqg" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 
10:33:01.659456 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a580b903-ba8b-44cb-bff3-8ff737dce411-scripts\") pod \"horizon-687c8c8cfc-p8gqg\" (UID: \"a580b903-ba8b-44cb-bff3-8ff737dce411\") " pod="openstack/horizon-687c8c8cfc-p8gqg" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.661887 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a580b903-ba8b-44cb-bff3-8ff737dce411-logs\") pod \"horizon-687c8c8cfc-p8gqg\" (UID: \"a580b903-ba8b-44cb-bff3-8ff737dce411\") " pod="openstack/horizon-687c8c8cfc-p8gqg" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.662307 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-c8srt" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.662643 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.662803 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a580b903-ba8b-44cb-bff3-8ff737dce411-config-data\") pod \"horizon-687c8c8cfc-p8gqg\" (UID: \"a580b903-ba8b-44cb-bff3-8ff737dce411\") " pod="openstack/horizon-687c8c8cfc-p8gqg" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.662907 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.669398 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a580b903-ba8b-44cb-bff3-8ff737dce411-horizon-secret-key\") pod \"horizon-687c8c8cfc-p8gqg\" (UID: \"a580b903-ba8b-44cb-bff3-8ff737dce411\") " pod="openstack/horizon-687c8c8cfc-p8gqg" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.685413 4735 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-zbzm7"] Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.686052 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-zbzm7" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.698765 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-t8d7j"] Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.710107 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xt2n\" (UniqueName: \"kubernetes.io/projected/a580b903-ba8b-44cb-bff3-8ff737dce411-kube-api-access-2xt2n\") pod \"horizon-687c8c8cfc-p8gqg\" (UID: \"a580b903-ba8b-44cb-bff3-8ff737dce411\") " pod="openstack/horizon-687c8c8cfc-p8gqg" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.762715 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ea41140-1a15-4e64-89e5-d68b9208dff1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7ea41140-1a15-4e64-89e5-d68b9208dff1\") " pod="openstack/ceilometer-0" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.762758 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7cb175b-c967-47c2-96b0-da043b6d3506-scripts\") pod \"placement-db-sync-t8d7j\" (UID: \"c7cb175b-c967-47c2-96b0-da043b6d3506\") " pod="openstack/placement-db-sync-t8d7j" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.762797 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ea41140-1a15-4e64-89e5-d68b9208dff1-scripts\") pod \"ceilometer-0\" (UID: \"7ea41140-1a15-4e64-89e5-d68b9208dff1\") " pod="openstack/ceilometer-0" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 
10:33:01.762819 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ea41140-1a15-4e64-89e5-d68b9208dff1-log-httpd\") pod \"ceilometer-0\" (UID: \"7ea41140-1a15-4e64-89e5-d68b9208dff1\") " pod="openstack/ceilometer-0" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.762840 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fxds\" (UniqueName: \"kubernetes.io/projected/c7cb175b-c967-47c2-96b0-da043b6d3506-kube-api-access-4fxds\") pod \"placement-db-sync-t8d7j\" (UID: \"c7cb175b-c967-47c2-96b0-da043b6d3506\") " pod="openstack/placement-db-sync-t8d7j" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.762855 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ea41140-1a15-4e64-89e5-d68b9208dff1-config-data\") pod \"ceilometer-0\" (UID: \"7ea41140-1a15-4e64-89e5-d68b9208dff1\") " pod="openstack/ceilometer-0" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.762877 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ea41140-1a15-4e64-89e5-d68b9208dff1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7ea41140-1a15-4e64-89e5-d68b9208dff1\") " pod="openstack/ceilometer-0" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.762901 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtnvx\" (UniqueName: \"kubernetes.io/projected/7ea41140-1a15-4e64-89e5-d68b9208dff1-kube-api-access-xtnvx\") pod \"ceilometer-0\" (UID: \"7ea41140-1a15-4e64-89e5-d68b9208dff1\") " pod="openstack/ceilometer-0" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.762913 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7cb175b-c967-47c2-96b0-da043b6d3506-logs\") pod \"placement-db-sync-t8d7j\" (UID: \"c7cb175b-c967-47c2-96b0-da043b6d3506\") " pod="openstack/placement-db-sync-t8d7j" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.762929 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7cb175b-c967-47c2-96b0-da043b6d3506-combined-ca-bundle\") pod \"placement-db-sync-t8d7j\" (UID: \"c7cb175b-c967-47c2-96b0-da043b6d3506\") " pod="openstack/placement-db-sync-t8d7j" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.762944 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ea41140-1a15-4e64-89e5-d68b9208dff1-run-httpd\") pod \"ceilometer-0\" (UID: \"7ea41140-1a15-4e64-89e5-d68b9208dff1\") " pod="openstack/ceilometer-0" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.762963 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7cb175b-c967-47c2-96b0-da043b6d3506-config-data\") pod \"placement-db-sync-t8d7j\" (UID: \"c7cb175b-c967-47c2-96b0-da043b6d3506\") " pod="openstack/placement-db-sync-t8d7j" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.766522 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7c88c854b9-2q2b2"] Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.780301 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7c88c854b9-2q2b2" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.823845 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-xjfvr"] Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.826712 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-xjfvr" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.836140 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tq997" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.866696 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ea41140-1a15-4e64-89e5-d68b9208dff1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7ea41140-1a15-4e64-89e5-d68b9208dff1\") " pod="openstack/ceilometer-0" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.866768 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7cb175b-c967-47c2-96b0-da043b6d3506-scripts\") pod \"placement-db-sync-t8d7j\" (UID: \"c7cb175b-c967-47c2-96b0-da043b6d3506\") " pod="openstack/placement-db-sync-t8d7j" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.866828 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ea41140-1a15-4e64-89e5-d68b9208dff1-scripts\") pod \"ceilometer-0\" (UID: \"7ea41140-1a15-4e64-89e5-d68b9208dff1\") " pod="openstack/ceilometer-0" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.866865 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ea41140-1a15-4e64-89e5-d68b9208dff1-log-httpd\") pod \"ceilometer-0\" (UID: \"7ea41140-1a15-4e64-89e5-d68b9208dff1\") " pod="openstack/ceilometer-0" 
Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.866900 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fxds\" (UniqueName: \"kubernetes.io/projected/c7cb175b-c967-47c2-96b0-da043b6d3506-kube-api-access-4fxds\") pod \"placement-db-sync-t8d7j\" (UID: \"c7cb175b-c967-47c2-96b0-da043b6d3506\") " pod="openstack/placement-db-sync-t8d7j" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.866926 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ea41140-1a15-4e64-89e5-d68b9208dff1-config-data\") pod \"ceilometer-0\" (UID: \"7ea41140-1a15-4e64-89e5-d68b9208dff1\") " pod="openstack/ceilometer-0" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.866971 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ea41140-1a15-4e64-89e5-d68b9208dff1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7ea41140-1a15-4e64-89e5-d68b9208dff1\") " pod="openstack/ceilometer-0" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.867016 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtnvx\" (UniqueName: \"kubernetes.io/projected/7ea41140-1a15-4e64-89e5-d68b9208dff1-kube-api-access-xtnvx\") pod \"ceilometer-0\" (UID: \"7ea41140-1a15-4e64-89e5-d68b9208dff1\") " pod="openstack/ceilometer-0" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.867044 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7cb175b-c967-47c2-96b0-da043b6d3506-logs\") pod \"placement-db-sync-t8d7j\" (UID: \"c7cb175b-c967-47c2-96b0-da043b6d3506\") " pod="openstack/placement-db-sync-t8d7j" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.867069 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/c7cb175b-c967-47c2-96b0-da043b6d3506-combined-ca-bundle\") pod \"placement-db-sync-t8d7j\" (UID: \"c7cb175b-c967-47c2-96b0-da043b6d3506\") " pod="openstack/placement-db-sync-t8d7j" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.867096 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ea41140-1a15-4e64-89e5-d68b9208dff1-run-httpd\") pod \"ceilometer-0\" (UID: \"7ea41140-1a15-4e64-89e5-d68b9208dff1\") " pod="openstack/ceilometer-0" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.867128 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7cb175b-c967-47c2-96b0-da043b6d3506-config-data\") pod \"placement-db-sync-t8d7j\" (UID: \"c7cb175b-c967-47c2-96b0-da043b6d3506\") " pod="openstack/placement-db-sync-t8d7j" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.869878 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7cb175b-c967-47c2-96b0-da043b6d3506-logs\") pod \"placement-db-sync-t8d7j\" (UID: \"c7cb175b-c967-47c2-96b0-da043b6d3506\") " pod="openstack/placement-db-sync-t8d7j" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.870226 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ea41140-1a15-4e64-89e5-d68b9208dff1-run-httpd\") pod \"ceilometer-0\" (UID: \"7ea41140-1a15-4e64-89e5-d68b9208dff1\") " pod="openstack/ceilometer-0" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.883192 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-687c8c8cfc-p8gqg" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.906835 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ea41140-1a15-4e64-89e5-d68b9208dff1-log-httpd\") pod \"ceilometer-0\" (UID: \"7ea41140-1a15-4e64-89e5-d68b9208dff1\") " pod="openstack/ceilometer-0" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.912522 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtnvx\" (UniqueName: \"kubernetes.io/projected/7ea41140-1a15-4e64-89e5-d68b9208dff1-kube-api-access-xtnvx\") pod \"ceilometer-0\" (UID: \"7ea41140-1a15-4e64-89e5-d68b9208dff1\") " pod="openstack/ceilometer-0" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.913037 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7cb175b-c967-47c2-96b0-da043b6d3506-combined-ca-bundle\") pod \"placement-db-sync-t8d7j\" (UID: \"c7cb175b-c967-47c2-96b0-da043b6d3506\") " pod="openstack/placement-db-sync-t8d7j" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.915464 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ea41140-1a15-4e64-89e5-d68b9208dff1-scripts\") pod \"ceilometer-0\" (UID: \"7ea41140-1a15-4e64-89e5-d68b9208dff1\") " pod="openstack/ceilometer-0" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.916557 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ea41140-1a15-4e64-89e5-d68b9208dff1-config-data\") pod \"ceilometer-0\" (UID: \"7ea41140-1a15-4e64-89e5-d68b9208dff1\") " pod="openstack/ceilometer-0" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.917399 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fxds\" (UniqueName: 
\"kubernetes.io/projected/c7cb175b-c967-47c2-96b0-da043b6d3506-kube-api-access-4fxds\") pod \"placement-db-sync-t8d7j\" (UID: \"c7cb175b-c967-47c2-96b0-da043b6d3506\") " pod="openstack/placement-db-sync-t8d7j" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.917965 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7cb175b-c967-47c2-96b0-da043b6d3506-config-data\") pod \"placement-db-sync-t8d7j\" (UID: \"c7cb175b-c967-47c2-96b0-da043b6d3506\") " pod="openstack/placement-db-sync-t8d7j" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.922289 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ea41140-1a15-4e64-89e5-d68b9208dff1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7ea41140-1a15-4e64-89e5-d68b9208dff1\") " pod="openstack/ceilometer-0" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.955341 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7cb175b-c967-47c2-96b0-da043b6d3506-scripts\") pod \"placement-db-sync-t8d7j\" (UID: \"c7cb175b-c967-47c2-96b0-da043b6d3506\") " pod="openstack/placement-db-sync-t8d7j" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.969726 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr62m\" (UniqueName: \"kubernetes.io/projected/dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9-kube-api-access-vr62m\") pod \"horizon-7c88c854b9-2q2b2\" (UID: \"dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9\") " pod="openstack/horizon-7c88c854b9-2q2b2" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.969774 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07675ec8-d5bc-450f-a1af-dd92e82f7696-config\") pod \"dnsmasq-dns-8b5c85b87-xjfvr\" (UID: 
\"07675ec8-d5bc-450f-a1af-dd92e82f7696\") " pod="openstack/dnsmasq-dns-8b5c85b87-xjfvr" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.969812 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9-scripts\") pod \"horizon-7c88c854b9-2q2b2\" (UID: \"dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9\") " pod="openstack/horizon-7c88c854b9-2q2b2" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.969838 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9-horizon-secret-key\") pod \"horizon-7c88c854b9-2q2b2\" (UID: \"dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9\") " pod="openstack/horizon-7c88c854b9-2q2b2" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.969863 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07675ec8-d5bc-450f-a1af-dd92e82f7696-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-xjfvr\" (UID: \"07675ec8-d5bc-450f-a1af-dd92e82f7696\") " pod="openstack/dnsmasq-dns-8b5c85b87-xjfvr" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.969930 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9-config-data\") pod \"horizon-7c88c854b9-2q2b2\" (UID: \"dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9\") " pod="openstack/horizon-7c88c854b9-2q2b2" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.969960 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07675ec8-d5bc-450f-a1af-dd92e82f7696-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-xjfvr\" (UID: 
\"07675ec8-d5bc-450f-a1af-dd92e82f7696\") " pod="openstack/dnsmasq-dns-8b5c85b87-xjfvr" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.969984 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07675ec8-d5bc-450f-a1af-dd92e82f7696-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-xjfvr\" (UID: \"07675ec8-d5bc-450f-a1af-dd92e82f7696\") " pod="openstack/dnsmasq-dns-8b5c85b87-xjfvr" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.970009 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxwzv\" (UniqueName: \"kubernetes.io/projected/07675ec8-d5bc-450f-a1af-dd92e82f7696-kube-api-access-nxwzv\") pod \"dnsmasq-dns-8b5c85b87-xjfvr\" (UID: \"07675ec8-d5bc-450f-a1af-dd92e82f7696\") " pod="openstack/dnsmasq-dns-8b5c85b87-xjfvr" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.970059 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9-logs\") pod \"horizon-7c88c854b9-2q2b2\" (UID: \"dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9\") " pod="openstack/horizon-7c88c854b9-2q2b2" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.970104 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07675ec8-d5bc-450f-a1af-dd92e82f7696-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-xjfvr\" (UID: \"07675ec8-d5bc-450f-a1af-dd92e82f7696\") " pod="openstack/dnsmasq-dns-8b5c85b87-xjfvr" Oct 01 10:33:01 crc kubenswrapper[4735]: I1001 10:33:01.997118 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ea41140-1a15-4e64-89e5-d68b9208dff1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"7ea41140-1a15-4e64-89e5-d68b9208dff1\") " pod="openstack/ceilometer-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.050139 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.065401 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.069648 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c88c854b9-2q2b2"] Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.069776 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.076101 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-t8d7j" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.076164 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07675ec8-d5bc-450f-a1af-dd92e82f7696-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-xjfvr\" (UID: \"07675ec8-d5bc-450f-a1af-dd92e82f7696\") " pod="openstack/dnsmasq-dns-8b5c85b87-xjfvr" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.076200 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07675ec8-d5bc-450f-a1af-dd92e82f7696-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-xjfvr\" (UID: \"07675ec8-d5bc-450f-a1af-dd92e82f7696\") " pod="openstack/dnsmasq-dns-8b5c85b87-xjfvr" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.076224 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxwzv\" (UniqueName: \"kubernetes.io/projected/07675ec8-d5bc-450f-a1af-dd92e82f7696-kube-api-access-nxwzv\") pod 
\"dnsmasq-dns-8b5c85b87-xjfvr\" (UID: \"07675ec8-d5bc-450f-a1af-dd92e82f7696\") " pod="openstack/dnsmasq-dns-8b5c85b87-xjfvr" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.076268 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9-logs\") pod \"horizon-7c88c854b9-2q2b2\" (UID: \"dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9\") " pod="openstack/horizon-7c88c854b9-2q2b2" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.076302 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07675ec8-d5bc-450f-a1af-dd92e82f7696-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-xjfvr\" (UID: \"07675ec8-d5bc-450f-a1af-dd92e82f7696\") " pod="openstack/dnsmasq-dns-8b5c85b87-xjfvr" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.076327 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr62m\" (UniqueName: \"kubernetes.io/projected/dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9-kube-api-access-vr62m\") pod \"horizon-7c88c854b9-2q2b2\" (UID: \"dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9\") " pod="openstack/horizon-7c88c854b9-2q2b2" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.076345 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07675ec8-d5bc-450f-a1af-dd92e82f7696-config\") pod \"dnsmasq-dns-8b5c85b87-xjfvr\" (UID: \"07675ec8-d5bc-450f-a1af-dd92e82f7696\") " pod="openstack/dnsmasq-dns-8b5c85b87-xjfvr" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.076377 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9-scripts\") pod \"horizon-7c88c854b9-2q2b2\" (UID: \"dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9\") " 
pod="openstack/horizon-7c88c854b9-2q2b2" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.076399 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9-horizon-secret-key\") pod \"horizon-7c88c854b9-2q2b2\" (UID: \"dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9\") " pod="openstack/horizon-7c88c854b9-2q2b2" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.076419 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07675ec8-d5bc-450f-a1af-dd92e82f7696-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-xjfvr\" (UID: \"07675ec8-d5bc-450f-a1af-dd92e82f7696\") " pod="openstack/dnsmasq-dns-8b5c85b87-xjfvr" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.076438 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9-config-data\") pod \"horizon-7c88c854b9-2q2b2\" (UID: \"dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9\") " pod="openstack/horizon-7c88c854b9-2q2b2" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.076905 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-8zx2n" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.077232 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.077568 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07675ec8-d5bc-450f-a1af-dd92e82f7696-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-xjfvr\" (UID: \"07675ec8-d5bc-450f-a1af-dd92e82f7696\") " pod="openstack/dnsmasq-dns-8b5c85b87-xjfvr" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.077853 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9-config-data\") pod \"horizon-7c88c854b9-2q2b2\" (UID: \"dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9\") " pod="openstack/horizon-7c88c854b9-2q2b2" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.077882 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9-logs\") pod \"horizon-7c88c854b9-2q2b2\" (UID: \"dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9\") " pod="openstack/horizon-7c88c854b9-2q2b2" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.078267 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07675ec8-d5bc-450f-a1af-dd92e82f7696-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-xjfvr\" (UID: \"07675ec8-d5bc-450f-a1af-dd92e82f7696\") " pod="openstack/dnsmasq-dns-8b5c85b87-xjfvr" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.081844 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07675ec8-d5bc-450f-a1af-dd92e82f7696-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-xjfvr\" (UID: \"07675ec8-d5bc-450f-a1af-dd92e82f7696\") " pod="openstack/dnsmasq-dns-8b5c85b87-xjfvr" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.085266 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07675ec8-d5bc-450f-a1af-dd92e82f7696-config\") pod \"dnsmasq-dns-8b5c85b87-xjfvr\" (UID: \"07675ec8-d5bc-450f-a1af-dd92e82f7696\") " pod="openstack/dnsmasq-dns-8b5c85b87-xjfvr" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.093216 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9-horizon-secret-key\") pod \"horizon-7c88c854b9-2q2b2\" (UID: \"dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9\") " pod="openstack/horizon-7c88c854b9-2q2b2" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.093836 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.094110 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9-scripts\") pod \"horizon-7c88c854b9-2q2b2\" (UID: \"dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9\") " pod="openstack/horizon-7c88c854b9-2q2b2" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.097290 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr62m\" (UniqueName: \"kubernetes.io/projected/dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9-kube-api-access-vr62m\") pod \"horizon-7c88c854b9-2q2b2\" (UID: \"dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9\") " pod="openstack/horizon-7c88c854b9-2q2b2" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.098839 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07675ec8-d5bc-450f-a1af-dd92e82f7696-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-xjfvr\" (UID: \"07675ec8-d5bc-450f-a1af-dd92e82f7696\") " pod="openstack/dnsmasq-dns-8b5c85b87-xjfvr" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.103061 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-xjfvr"] Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.105170 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxwzv\" (UniqueName: \"kubernetes.io/projected/07675ec8-d5bc-450f-a1af-dd92e82f7696-kube-api-access-nxwzv\") pod \"dnsmasq-dns-8b5c85b87-xjfvr\" (UID: \"07675ec8-d5bc-450f-a1af-dd92e82f7696\") 
" pod="openstack/dnsmasq-dns-8b5c85b87-xjfvr" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.113036 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.117068 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-65bxv" event={"ID":"2704624e-91f8-44f8-be80-d011132f7516","Type":"ContainerStarted","Data":"b61af319b5a4ccaa23c3dd0fc5a3b8e288e468f464b2e26a7835035bb687cd3a"} Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.117226 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7ff5475cc9-65bxv" podUID="2704624e-91f8-44f8-be80-d011132f7516" containerName="dnsmasq-dns" containerID="cri-o://b61af319b5a4ccaa23c3dd0fc5a3b8e288e468f464b2e26a7835035bb687cd3a" gracePeriod=10 Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.117517 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7ff5475cc9-65bxv" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.138941 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.141046 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.141140 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.149533 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.149814 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7ff5475cc9-65bxv" podStartSLOduration=3.149797005 podStartE2EDuration="3.149797005s" podCreationTimestamp="2025-10-01 10:32:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:33:02.146487436 +0000 UTC m=+940.839308698" watchObservedRunningTime="2025-10-01 10:33:02.149797005 +0000 UTC m=+940.842618267" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.177658 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b74e1979-8744-4246-96b0-13b3fbbff698-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b74e1979-8744-4246-96b0-13b3fbbff698\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.177744 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"b74e1979-8744-4246-96b0-13b3fbbff698\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.177789 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b74e1979-8744-4246-96b0-13b3fbbff698-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b74e1979-8744-4246-96b0-13b3fbbff698\") " 
pod="openstack/glance-default-internal-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.177933 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tj5m\" (UniqueName: \"kubernetes.io/projected/b74e1979-8744-4246-96b0-13b3fbbff698-kube-api-access-4tj5m\") pod \"glance-default-internal-api-0\" (UID: \"b74e1979-8744-4246-96b0-13b3fbbff698\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.177996 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b74e1979-8744-4246-96b0-13b3fbbff698-logs\") pod \"glance-default-internal-api-0\" (UID: \"b74e1979-8744-4246-96b0-13b3fbbff698\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.178033 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b74e1979-8744-4246-96b0-13b3fbbff698-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b74e1979-8744-4246-96b0-13b3fbbff698\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.178068 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b74e1979-8744-4246-96b0-13b3fbbff698-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b74e1979-8744-4246-96b0-13b3fbbff698\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.192225 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-xjfvr" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.197823 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7c88c854b9-2q2b2" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.279231 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dbb714a-be88-41cd-aa13-93fed3c7a417-logs\") pod \"glance-default-external-api-0\" (UID: \"2dbb714a-be88-41cd-aa13-93fed3c7a417\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.279282 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tj5m\" (UniqueName: \"kubernetes.io/projected/b74e1979-8744-4246-96b0-13b3fbbff698-kube-api-access-4tj5m\") pod \"glance-default-internal-api-0\" (UID: \"b74e1979-8744-4246-96b0-13b3fbbff698\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.279326 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b74e1979-8744-4246-96b0-13b3fbbff698-logs\") pod \"glance-default-internal-api-0\" (UID: \"b74e1979-8744-4246-96b0-13b3fbbff698\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.279351 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2dbb714a-be88-41cd-aa13-93fed3c7a417-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2dbb714a-be88-41cd-aa13-93fed3c7a417\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.279370 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dbb714a-be88-41cd-aa13-93fed3c7a417-scripts\") pod \"glance-default-external-api-0\" (UID: \"2dbb714a-be88-41cd-aa13-93fed3c7a417\") " 
pod="openstack/glance-default-external-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.279399 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b74e1979-8744-4246-96b0-13b3fbbff698-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b74e1979-8744-4246-96b0-13b3fbbff698\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.279414 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b74e1979-8744-4246-96b0-13b3fbbff698-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b74e1979-8744-4246-96b0-13b3fbbff698\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.279444 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"2dbb714a-be88-41cd-aa13-93fed3c7a417\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.279474 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b74e1979-8744-4246-96b0-13b3fbbff698-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b74e1979-8744-4246-96b0-13b3fbbff698\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.279569 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzmv2\" (UniqueName: \"kubernetes.io/projected/2dbb714a-be88-41cd-aa13-93fed3c7a417-kube-api-access-kzmv2\") pod \"glance-default-external-api-0\" (UID: \"2dbb714a-be88-41cd-aa13-93fed3c7a417\") " 
pod="openstack/glance-default-external-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.279588 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dbb714a-be88-41cd-aa13-93fed3c7a417-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2dbb714a-be88-41cd-aa13-93fed3c7a417\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.279607 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"b74e1979-8744-4246-96b0-13b3fbbff698\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.279628 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b74e1979-8744-4246-96b0-13b3fbbff698-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b74e1979-8744-4246-96b0-13b3fbbff698\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.279657 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dbb714a-be88-41cd-aa13-93fed3c7a417-config-data\") pod \"glance-default-external-api-0\" (UID: \"2dbb714a-be88-41cd-aa13-93fed3c7a417\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.280674 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b74e1979-8744-4246-96b0-13b3fbbff698-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b74e1979-8744-4246-96b0-13b3fbbff698\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:02 
crc kubenswrapper[4735]: I1001 10:33:02.280683 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b74e1979-8744-4246-96b0-13b3fbbff698-logs\") pod \"glance-default-internal-api-0\" (UID: \"b74e1979-8744-4246-96b0-13b3fbbff698\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.280972 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"b74e1979-8744-4246-96b0-13b3fbbff698\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.287341 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b74e1979-8744-4246-96b0-13b3fbbff698-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b74e1979-8744-4246-96b0-13b3fbbff698\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.289219 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b74e1979-8744-4246-96b0-13b3fbbff698-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b74e1979-8744-4246-96b0-13b3fbbff698\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.293951 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b74e1979-8744-4246-96b0-13b3fbbff698-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b74e1979-8744-4246-96b0-13b3fbbff698\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.302313 4735 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-4tj5m\" (UniqueName: \"kubernetes.io/projected/b74e1979-8744-4246-96b0-13b3fbbff698-kube-api-access-4tj5m\") pod \"glance-default-internal-api-0\" (UID: \"b74e1979-8744-4246-96b0-13b3fbbff698\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.319424 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"b74e1979-8744-4246-96b0-13b3fbbff698\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.369684 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-zbzm7"] Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.380980 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzmv2\" (UniqueName: \"kubernetes.io/projected/2dbb714a-be88-41cd-aa13-93fed3c7a417-kube-api-access-kzmv2\") pod \"glance-default-external-api-0\" (UID: \"2dbb714a-be88-41cd-aa13-93fed3c7a417\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.381015 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dbb714a-be88-41cd-aa13-93fed3c7a417-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2dbb714a-be88-41cd-aa13-93fed3c7a417\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.381060 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dbb714a-be88-41cd-aa13-93fed3c7a417-config-data\") pod \"glance-default-external-api-0\" (UID: \"2dbb714a-be88-41cd-aa13-93fed3c7a417\") " pod="openstack/glance-default-external-api-0" Oct 01 
10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.381086 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dbb714a-be88-41cd-aa13-93fed3c7a417-logs\") pod \"glance-default-external-api-0\" (UID: \"2dbb714a-be88-41cd-aa13-93fed3c7a417\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.381137 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2dbb714a-be88-41cd-aa13-93fed3c7a417-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2dbb714a-be88-41cd-aa13-93fed3c7a417\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.381157 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dbb714a-be88-41cd-aa13-93fed3c7a417-scripts\") pod \"glance-default-external-api-0\" (UID: \"2dbb714a-be88-41cd-aa13-93fed3c7a417\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.381196 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"2dbb714a-be88-41cd-aa13-93fed3c7a417\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.381356 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"2dbb714a-be88-41cd-aa13-93fed3c7a417\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.382425 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2dbb714a-be88-41cd-aa13-93fed3c7a417-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2dbb714a-be88-41cd-aa13-93fed3c7a417\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.382670 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dbb714a-be88-41cd-aa13-93fed3c7a417-logs\") pod \"glance-default-external-api-0\" (UID: \"2dbb714a-be88-41cd-aa13-93fed3c7a417\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.390296 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dbb714a-be88-41cd-aa13-93fed3c7a417-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2dbb714a-be88-41cd-aa13-93fed3c7a417\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.391042 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dbb714a-be88-41cd-aa13-93fed3c7a417-config-data\") pod \"glance-default-external-api-0\" (UID: \"2dbb714a-be88-41cd-aa13-93fed3c7a417\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.393884 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dbb714a-be88-41cd-aa13-93fed3c7a417-scripts\") pod \"glance-default-external-api-0\" (UID: \"2dbb714a-be88-41cd-aa13-93fed3c7a417\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.402792 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzmv2\" (UniqueName: 
\"kubernetes.io/projected/2dbb714a-be88-41cd-aa13-93fed3c7a417-kube-api-access-kzmv2\") pod \"glance-default-external-api-0\" (UID: \"2dbb714a-be88-41cd-aa13-93fed3c7a417\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.410354 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.447166 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"2dbb714a-be88-41cd-aa13-93fed3c7a417\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.469472 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.559775 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-687c8c8cfc-p8gqg"] Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.653286 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-65bxv" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.715911 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-tq997"] Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.793246 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2704624e-91f8-44f8-be80-d011132f7516-config\") pod \"2704624e-91f8-44f8-be80-d011132f7516\" (UID: \"2704624e-91f8-44f8-be80-d011132f7516\") " Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.793360 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2704624e-91f8-44f8-be80-d011132f7516-ovsdbserver-sb\") pod \"2704624e-91f8-44f8-be80-d011132f7516\" (UID: \"2704624e-91f8-44f8-be80-d011132f7516\") " Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.793419 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2704624e-91f8-44f8-be80-d011132f7516-dns-swift-storage-0\") pod \"2704624e-91f8-44f8-be80-d011132f7516\" (UID: \"2704624e-91f8-44f8-be80-d011132f7516\") " Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.793444 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2704624e-91f8-44f8-be80-d011132f7516-dns-svc\") pod \"2704624e-91f8-44f8-be80-d011132f7516\" (UID: \"2704624e-91f8-44f8-be80-d011132f7516\") " Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.793466 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbgpd\" (UniqueName: \"kubernetes.io/projected/2704624e-91f8-44f8-be80-d011132f7516-kube-api-access-nbgpd\") pod \"2704624e-91f8-44f8-be80-d011132f7516\" (UID: \"2704624e-91f8-44f8-be80-d011132f7516\") " Oct 
01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.793585 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2704624e-91f8-44f8-be80-d011132f7516-ovsdbserver-nb\") pod \"2704624e-91f8-44f8-be80-d011132f7516\" (UID: \"2704624e-91f8-44f8-be80-d011132f7516\") " Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.800848 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2704624e-91f8-44f8-be80-d011132f7516-kube-api-access-nbgpd" (OuterVolumeSpecName: "kube-api-access-nbgpd") pod "2704624e-91f8-44f8-be80-d011132f7516" (UID: "2704624e-91f8-44f8-be80-d011132f7516"). InnerVolumeSpecName "kube-api-access-nbgpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.847538 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c88c854b9-2q2b2"] Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.866450 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.874588 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-xjfvr"] Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.876035 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2704624e-91f8-44f8-be80-d011132f7516-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2704624e-91f8-44f8-be80-d011132f7516" (UID: "2704624e-91f8-44f8-be80-d011132f7516"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.878684 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2704624e-91f8-44f8-be80-d011132f7516-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2704624e-91f8-44f8-be80-d011132f7516" (UID: "2704624e-91f8-44f8-be80-d011132f7516"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.882972 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-t8d7j"] Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.896135 4735 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2704624e-91f8-44f8-be80-d011132f7516-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.896186 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbgpd\" (UniqueName: \"kubernetes.io/projected/2704624e-91f8-44f8-be80-d011132f7516-kube-api-access-nbgpd\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.896196 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2704624e-91f8-44f8-be80-d011132f7516-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.922840 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2704624e-91f8-44f8-be80-d011132f7516-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2704624e-91f8-44f8-be80-d011132f7516" (UID: "2704624e-91f8-44f8-be80-d011132f7516"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.929037 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2704624e-91f8-44f8-be80-d011132f7516-config" (OuterVolumeSpecName: "config") pod "2704624e-91f8-44f8-be80-d011132f7516" (UID: "2704624e-91f8-44f8-be80-d011132f7516"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.938487 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2704624e-91f8-44f8-be80-d011132f7516-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2704624e-91f8-44f8-be80-d011132f7516" (UID: "2704624e-91f8-44f8-be80-d011132f7516"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.997944 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2704624e-91f8-44f8-be80-d011132f7516-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.997969 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2704624e-91f8-44f8-be80-d011132f7516-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:02 crc kubenswrapper[4735]: I1001 10:33:02.997983 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2704624e-91f8-44f8-be80-d011132f7516-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:03 crc kubenswrapper[4735]: I1001 10:33:03.084111 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 10:33:03 crc kubenswrapper[4735]: I1001 10:33:03.127362 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-t8d7j" 
event={"ID":"c7cb175b-c967-47c2-96b0-da043b6d3506","Type":"ContainerStarted","Data":"76c946918ca0b6e73121c36c73127c91f19bd2e50d94618b1316961cd197bf8f"} Oct 01 10:33:03 crc kubenswrapper[4735]: I1001 10:33:03.131388 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-687c8c8cfc-p8gqg" event={"ID":"a580b903-ba8b-44cb-bff3-8ff737dce411","Type":"ContainerStarted","Data":"66f39faa86d65dbaf4bb9a88b859ca67b4a01f82d67c07474667d7c54af8cbf7"} Oct 01 10:33:03 crc kubenswrapper[4735]: I1001 10:33:03.136354 4735 generic.go:334] "Generic (PLEG): container finished" podID="2704624e-91f8-44f8-be80-d011132f7516" containerID="b61af319b5a4ccaa23c3dd0fc5a3b8e288e468f464b2e26a7835035bb687cd3a" exitCode=0 Oct 01 10:33:03 crc kubenswrapper[4735]: I1001 10:33:03.136417 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-65bxv" event={"ID":"2704624e-91f8-44f8-be80-d011132f7516","Type":"ContainerDied","Data":"b61af319b5a4ccaa23c3dd0fc5a3b8e288e468f464b2e26a7835035bb687cd3a"} Oct 01 10:33:03 crc kubenswrapper[4735]: I1001 10:33:03.136444 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-65bxv" event={"ID":"2704624e-91f8-44f8-be80-d011132f7516","Type":"ContainerDied","Data":"35b600f113eb5fb9819f29ca4e1a2e5672891334b5661742f3a7f7b5a277ed3e"} Oct 01 10:33:03 crc kubenswrapper[4735]: I1001 10:33:03.136460 4735 scope.go:117] "RemoveContainer" containerID="b61af319b5a4ccaa23c3dd0fc5a3b8e288e468f464b2e26a7835035bb687cd3a" Oct 01 10:33:03 crc kubenswrapper[4735]: I1001 10:33:03.136610 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-65bxv" Oct 01 10:33:03 crc kubenswrapper[4735]: I1001 10:33:03.139377 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ea41140-1a15-4e64-89e5-d68b9208dff1","Type":"ContainerStarted","Data":"e6feb6966c4d38b0eb4e8a892ac550439b1569b1d50851296b4e8560a4a6db55"} Oct 01 10:33:03 crc kubenswrapper[4735]: I1001 10:33:03.143134 4735 generic.go:334] "Generic (PLEG): container finished" podID="07675ec8-d5bc-450f-a1af-dd92e82f7696" containerID="da4392475356db13bbea088b279054b8073d3fda2f54cc090029c533150ff6df" exitCode=0 Oct 01 10:33:03 crc kubenswrapper[4735]: I1001 10:33:03.143232 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-xjfvr" event={"ID":"07675ec8-d5bc-450f-a1af-dd92e82f7696","Type":"ContainerDied","Data":"da4392475356db13bbea088b279054b8073d3fda2f54cc090029c533150ff6df"} Oct 01 10:33:03 crc kubenswrapper[4735]: I1001 10:33:03.143284 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-xjfvr" event={"ID":"07675ec8-d5bc-450f-a1af-dd92e82f7696","Type":"ContainerStarted","Data":"041f729007a1a297ac8706f5669857264b9163b5b802b96e7962cb369573a589"} Oct 01 10:33:03 crc kubenswrapper[4735]: I1001 10:33:03.145905 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tq997" event={"ID":"fe773cb9-e1ff-4413-872e-bb7ae002c86b","Type":"ContainerStarted","Data":"d05439fcc255e8c49ca48f154e27cb80a2bfc88aaf5d787fc969ae1583ece2f9"} Oct 01 10:33:03 crc kubenswrapper[4735]: I1001 10:33:03.145953 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tq997" event={"ID":"fe773cb9-e1ff-4413-872e-bb7ae002c86b","Type":"ContainerStarted","Data":"5addcafcfcd24c8ce81fb98f3fc4ac154fb01b7c092574cdca3a95469c988e58"} Oct 01 10:33:03 crc kubenswrapper[4735]: I1001 10:33:03.147316 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-7c88c854b9-2q2b2" event={"ID":"dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9","Type":"ContainerStarted","Data":"b0730ec3f02e35390c651ddf406a8bdfec64dafc6a173a790f4862099f0712dc"} Oct 01 10:33:03 crc kubenswrapper[4735]: I1001 10:33:03.159034 4735 generic.go:334] "Generic (PLEG): container finished" podID="8bb0c796-be9f-424d-b9a0-48d6d6dd999b" containerID="08980ce4c20e1a6f8ff60856f4fac6f35712deda0c6b203b5fe32db7be6a0252" exitCode=0 Oct 01 10:33:03 crc kubenswrapper[4735]: I1001 10:33:03.159095 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-zbzm7" event={"ID":"8bb0c796-be9f-424d-b9a0-48d6d6dd999b","Type":"ContainerDied","Data":"08980ce4c20e1a6f8ff60856f4fac6f35712deda0c6b203b5fe32db7be6a0252"} Oct 01 10:33:03 crc kubenswrapper[4735]: I1001 10:33:03.159129 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-zbzm7" event={"ID":"8bb0c796-be9f-424d-b9a0-48d6d6dd999b","Type":"ContainerStarted","Data":"b5c42db8cf3b12abb46f5d4edbcb10da7688862bf820b70e66dbaee1937e0d19"} Oct 01 10:33:03 crc kubenswrapper[4735]: I1001 10:33:03.211077 4735 scope.go:117] "RemoveContainer" containerID="1ea0b5a324e3e9613893b30072b0e33bc1444a96924598619087ac3edbd5dece" Oct 01 10:33:03 crc kubenswrapper[4735]: I1001 10:33:03.212016 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-tq997" podStartSLOduration=2.211998976 podStartE2EDuration="2.211998976s" podCreationTimestamp="2025-10-01 10:33:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:33:03.182778934 +0000 UTC m=+941.875600196" watchObservedRunningTime="2025-10-01 10:33:03.211998976 +0000 UTC m=+941.904820228" Oct 01 10:33:03 crc kubenswrapper[4735]: I1001 10:33:03.211368 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 10:33:03 
crc kubenswrapper[4735]: I1001 10:33:03.241235 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-65bxv"] Oct 01 10:33:03 crc kubenswrapper[4735]: I1001 10:33:03.257836 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-65bxv"] Oct 01 10:33:03 crc kubenswrapper[4735]: I1001 10:33:03.265723 4735 scope.go:117] "RemoveContainer" containerID="b61af319b5a4ccaa23c3dd0fc5a3b8e288e468f464b2e26a7835035bb687cd3a" Oct 01 10:33:03 crc kubenswrapper[4735]: E1001 10:33:03.266189 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b61af319b5a4ccaa23c3dd0fc5a3b8e288e468f464b2e26a7835035bb687cd3a\": container with ID starting with b61af319b5a4ccaa23c3dd0fc5a3b8e288e468f464b2e26a7835035bb687cd3a not found: ID does not exist" containerID="b61af319b5a4ccaa23c3dd0fc5a3b8e288e468f464b2e26a7835035bb687cd3a" Oct 01 10:33:03 crc kubenswrapper[4735]: I1001 10:33:03.266239 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b61af319b5a4ccaa23c3dd0fc5a3b8e288e468f464b2e26a7835035bb687cd3a"} err="failed to get container status \"b61af319b5a4ccaa23c3dd0fc5a3b8e288e468f464b2e26a7835035bb687cd3a\": rpc error: code = NotFound desc = could not find container \"b61af319b5a4ccaa23c3dd0fc5a3b8e288e468f464b2e26a7835035bb687cd3a\": container with ID starting with b61af319b5a4ccaa23c3dd0fc5a3b8e288e468f464b2e26a7835035bb687cd3a not found: ID does not exist" Oct 01 10:33:03 crc kubenswrapper[4735]: I1001 10:33:03.266272 4735 scope.go:117] "RemoveContainer" containerID="1ea0b5a324e3e9613893b30072b0e33bc1444a96924598619087ac3edbd5dece" Oct 01 10:33:03 crc kubenswrapper[4735]: E1001 10:33:03.266653 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ea0b5a324e3e9613893b30072b0e33bc1444a96924598619087ac3edbd5dece\": container 
with ID starting with 1ea0b5a324e3e9613893b30072b0e33bc1444a96924598619087ac3edbd5dece not found: ID does not exist" containerID="1ea0b5a324e3e9613893b30072b0e33bc1444a96924598619087ac3edbd5dece" Oct 01 10:33:03 crc kubenswrapper[4735]: I1001 10:33:03.266689 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ea0b5a324e3e9613893b30072b0e33bc1444a96924598619087ac3edbd5dece"} err="failed to get container status \"1ea0b5a324e3e9613893b30072b0e33bc1444a96924598619087ac3edbd5dece\": rpc error: code = NotFound desc = could not find container \"1ea0b5a324e3e9613893b30072b0e33bc1444a96924598619087ac3edbd5dece\": container with ID starting with 1ea0b5a324e3e9613893b30072b0e33bc1444a96924598619087ac3edbd5dece not found: ID does not exist" Oct 01 10:33:03 crc kubenswrapper[4735]: I1001 10:33:03.909367 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2704624e-91f8-44f8-be80-d011132f7516" path="/var/lib/kubelet/pods/2704624e-91f8-44f8-be80-d011132f7516/volumes" Oct 01 10:33:04 crc kubenswrapper[4735]: I1001 10:33:04.062261 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-zbzm7" Oct 01 10:33:04 crc kubenswrapper[4735]: I1001 10:33:04.173075 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2dbb714a-be88-41cd-aa13-93fed3c7a417","Type":"ContainerStarted","Data":"d7b58d5fe1d875294360d37e5f3cf58ff9542d4d628c5212f11d7ae6aac558d5"} Oct 01 10:33:04 crc kubenswrapper[4735]: I1001 10:33:04.173110 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2dbb714a-be88-41cd-aa13-93fed3c7a417","Type":"ContainerStarted","Data":"f78ff279464cbb426665a438fc44e9aa1b72be36a7c693316047ba4bbfb9ddb9"} Oct 01 10:33:04 crc kubenswrapper[4735]: I1001 10:33:04.179953 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b74e1979-8744-4246-96b0-13b3fbbff698","Type":"ContainerStarted","Data":"e9f5c7f24585fa8ee86eab3944a986db14a7d18437e2e38e048271bf0509211a"} Oct 01 10:33:04 crc kubenswrapper[4735]: I1001 10:33:04.179978 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b74e1979-8744-4246-96b0-13b3fbbff698","Type":"ContainerStarted","Data":"8606bc1234e98100a57b5155da12b055e24c73db1061429db17bccdc3dcc531b"} Oct 01 10:33:04 crc kubenswrapper[4735]: I1001 10:33:04.188086 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-xjfvr" event={"ID":"07675ec8-d5bc-450f-a1af-dd92e82f7696","Type":"ContainerStarted","Data":"fbc1d1928b1bed5b4a3c7fabed3cd6149ead79c3cccff96b2415ff3f20fd203c"} Oct 01 10:33:04 crc kubenswrapper[4735]: I1001 10:33:04.189348 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8b5c85b87-xjfvr" Oct 01 10:33:04 crc kubenswrapper[4735]: I1001 10:33:04.199853 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-zbzm7" Oct 01 10:33:04 crc kubenswrapper[4735]: I1001 10:33:04.200251 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-zbzm7" event={"ID":"8bb0c796-be9f-424d-b9a0-48d6d6dd999b","Type":"ContainerDied","Data":"b5c42db8cf3b12abb46f5d4edbcb10da7688862bf820b70e66dbaee1937e0d19"} Oct 01 10:33:04 crc kubenswrapper[4735]: I1001 10:33:04.200291 4735 scope.go:117] "RemoveContainer" containerID="08980ce4c20e1a6f8ff60856f4fac6f35712deda0c6b203b5fe32db7be6a0252" Oct 01 10:33:04 crc kubenswrapper[4735]: I1001 10:33:04.212053 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8b5c85b87-xjfvr" podStartSLOduration=3.212038765 podStartE2EDuration="3.212038765s" podCreationTimestamp="2025-10-01 10:33:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:33:04.209405295 +0000 UTC m=+942.902226557" watchObservedRunningTime="2025-10-01 10:33:04.212038765 +0000 UTC m=+942.904860017" Oct 01 10:33:04 crc kubenswrapper[4735]: I1001 10:33:04.226443 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cw7cg\" (UniqueName: \"kubernetes.io/projected/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-kube-api-access-cw7cg\") pod \"8bb0c796-be9f-424d-b9a0-48d6d6dd999b\" (UID: \"8bb0c796-be9f-424d-b9a0-48d6d6dd999b\") " Oct 01 10:33:04 crc kubenswrapper[4735]: I1001 10:33:04.226537 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-dns-svc\") pod \"8bb0c796-be9f-424d-b9a0-48d6d6dd999b\" (UID: \"8bb0c796-be9f-424d-b9a0-48d6d6dd999b\") " Oct 01 10:33:04 crc kubenswrapper[4735]: I1001 10:33:04.226837 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-config\") pod \"8bb0c796-be9f-424d-b9a0-48d6d6dd999b\" (UID: \"8bb0c796-be9f-424d-b9a0-48d6d6dd999b\") " Oct 01 10:33:04 crc kubenswrapper[4735]: I1001 10:33:04.226895 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-ovsdbserver-nb\") pod \"8bb0c796-be9f-424d-b9a0-48d6d6dd999b\" (UID: \"8bb0c796-be9f-424d-b9a0-48d6d6dd999b\") " Oct 01 10:33:04 crc kubenswrapper[4735]: I1001 10:33:04.226929 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-dns-swift-storage-0\") pod \"8bb0c796-be9f-424d-b9a0-48d6d6dd999b\" (UID: \"8bb0c796-be9f-424d-b9a0-48d6d6dd999b\") " Oct 01 10:33:04 crc kubenswrapper[4735]: I1001 10:33:04.226963 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-ovsdbserver-sb\") pod \"8bb0c796-be9f-424d-b9a0-48d6d6dd999b\" (UID: \"8bb0c796-be9f-424d-b9a0-48d6d6dd999b\") " Oct 01 10:33:04 crc kubenswrapper[4735]: I1001 10:33:04.234462 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-kube-api-access-cw7cg" (OuterVolumeSpecName: "kube-api-access-cw7cg") pod "8bb0c796-be9f-424d-b9a0-48d6d6dd999b" (UID: "8bb0c796-be9f-424d-b9a0-48d6d6dd999b"). InnerVolumeSpecName "kube-api-access-cw7cg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:33:04 crc kubenswrapper[4735]: I1001 10:33:04.261581 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8bb0c796-be9f-424d-b9a0-48d6d6dd999b" (UID: "8bb0c796-be9f-424d-b9a0-48d6d6dd999b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:33:04 crc kubenswrapper[4735]: I1001 10:33:04.261667 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8bb0c796-be9f-424d-b9a0-48d6d6dd999b" (UID: "8bb0c796-be9f-424d-b9a0-48d6d6dd999b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:33:04 crc kubenswrapper[4735]: I1001 10:33:04.262402 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-config" (OuterVolumeSpecName: "config") pod "8bb0c796-be9f-424d-b9a0-48d6d6dd999b" (UID: "8bb0c796-be9f-424d-b9a0-48d6d6dd999b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:33:04 crc kubenswrapper[4735]: I1001 10:33:04.265556 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8bb0c796-be9f-424d-b9a0-48d6d6dd999b" (UID: "8bb0c796-be9f-424d-b9a0-48d6d6dd999b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:33:04 crc kubenswrapper[4735]: I1001 10:33:04.267085 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8bb0c796-be9f-424d-b9a0-48d6d6dd999b" (UID: "8bb0c796-be9f-424d-b9a0-48d6d6dd999b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:33:04 crc kubenswrapper[4735]: I1001 10:33:04.329391 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:04 crc kubenswrapper[4735]: I1001 10:33:04.329418 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:04 crc kubenswrapper[4735]: I1001 10:33:04.329428 4735 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:04 crc kubenswrapper[4735]: I1001 10:33:04.329438 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:04 crc kubenswrapper[4735]: I1001 10:33:04.329446 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cw7cg\" (UniqueName: \"kubernetes.io/projected/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-kube-api-access-cw7cg\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:04 crc kubenswrapper[4735]: I1001 10:33:04.329454 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/8bb0c796-be9f-424d-b9a0-48d6d6dd999b-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:04 crc kubenswrapper[4735]: I1001 10:33:04.561511 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-zbzm7"] Oct 01 10:33:04 crc kubenswrapper[4735]: I1001 10:33:04.567255 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-zbzm7"] Oct 01 10:33:04 crc kubenswrapper[4735]: E1001 10:33:04.631176 4735 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8bb0c796_be9f_424d_b9a0_48d6d6dd999b.slice/crio-b5c42db8cf3b12abb46f5d4edbcb10da7688862bf820b70e66dbaee1937e0d19\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8bb0c796_be9f_424d_b9a0_48d6d6dd999b.slice\": RecentStats: unable to find data in memory cache]" Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 10:33:05.218057 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b74e1979-8744-4246-96b0-13b3fbbff698","Type":"ContainerStarted","Data":"28aa39512cbc18af38aa34dd5f4305a56bffa436d2d1504d64b86e11da46e0c7"} Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 10:33:05.228305 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2dbb714a-be88-41cd-aa13-93fed3c7a417","Type":"ContainerStarted","Data":"13c0d553f71e35c29e58aad707db5ba279925752f8b3f13cf13508d9f91a338d"} Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 10:33:05.276580 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.274332419 podStartE2EDuration="4.274332419s" podCreationTimestamp="2025-10-01 10:33:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:33:05.263835679 +0000 UTC m=+943.956656951" watchObservedRunningTime="2025-10-01 10:33:05.274332419 +0000 UTC m=+943.967153681" Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 10:33:05.279711 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.279698103 podStartE2EDuration="4.279698103s" podCreationTimestamp="2025-10-01 10:33:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:33:05.242777716 +0000 UTC m=+943.935598988" watchObservedRunningTime="2025-10-01 10:33:05.279698103 +0000 UTC m=+943.972519365" Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 10:33:05.385242 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 10:33:05.424458 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-687c8c8cfc-p8gqg"] Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 10:33:05.439571 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 10:33:05.462171 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-cd5d6d777-529hb"] Oct 01 10:33:05 crc kubenswrapper[4735]: E1001 10:33:05.462558 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb0c796-be9f-424d-b9a0-48d6d6dd999b" containerName="init" Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 10:33:05.462574 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb0c796-be9f-424d-b9a0-48d6d6dd999b" containerName="init" Oct 01 10:33:05 crc kubenswrapper[4735]: E1001 10:33:05.462592 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2704624e-91f8-44f8-be80-d011132f7516" containerName="dnsmasq-dns" 
Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 10:33:05.462599 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2704624e-91f8-44f8-be80-d011132f7516" containerName="dnsmasq-dns" Oct 01 10:33:05 crc kubenswrapper[4735]: E1001 10:33:05.462624 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2704624e-91f8-44f8-be80-d011132f7516" containerName="init" Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 10:33:05.462630 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2704624e-91f8-44f8-be80-d011132f7516" containerName="init" Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 10:33:05.462782 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2704624e-91f8-44f8-be80-d011132f7516" containerName="dnsmasq-dns" Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 10:33:05.462804 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bb0c796-be9f-424d-b9a0-48d6d6dd999b" containerName="init" Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 10:33:05.463646 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-cd5d6d777-529hb" Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 10:33:05.478528 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-cd5d6d777-529hb"] Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 10:33:05.484415 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 10:33:05.486749 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 10:33:05.486787 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 10:33:05.548968 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/db7fa1b4-c7fb-4b97-85be-4fec151375fa-horizon-secret-key\") pod \"horizon-cd5d6d777-529hb\" (UID: \"db7fa1b4-c7fb-4b97-85be-4fec151375fa\") " pod="openstack/horizon-cd5d6d777-529hb" Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 10:33:05.549041 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4479x\" (UniqueName: \"kubernetes.io/projected/db7fa1b4-c7fb-4b97-85be-4fec151375fa-kube-api-access-4479x\") pod \"horizon-cd5d6d777-529hb\" (UID: \"db7fa1b4-c7fb-4b97-85be-4fec151375fa\") " pod="openstack/horizon-cd5d6d777-529hb" Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 
10:33:05.549079 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/db7fa1b4-c7fb-4b97-85be-4fec151375fa-scripts\") pod \"horizon-cd5d6d777-529hb\" (UID: \"db7fa1b4-c7fb-4b97-85be-4fec151375fa\") " pod="openstack/horizon-cd5d6d777-529hb" Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 10:33:05.549099 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db7fa1b4-c7fb-4b97-85be-4fec151375fa-logs\") pod \"horizon-cd5d6d777-529hb\" (UID: \"db7fa1b4-c7fb-4b97-85be-4fec151375fa\") " pod="openstack/horizon-cd5d6d777-529hb" Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 10:33:05.549450 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/db7fa1b4-c7fb-4b97-85be-4fec151375fa-config-data\") pod \"horizon-cd5d6d777-529hb\" (UID: \"db7fa1b4-c7fb-4b97-85be-4fec151375fa\") " pod="openstack/horizon-cd5d6d777-529hb" Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 10:33:05.651267 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/db7fa1b4-c7fb-4b97-85be-4fec151375fa-horizon-secret-key\") pod \"horizon-cd5d6d777-529hb\" (UID: \"db7fa1b4-c7fb-4b97-85be-4fec151375fa\") " pod="openstack/horizon-cd5d6d777-529hb" Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 10:33:05.651348 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4479x\" (UniqueName: \"kubernetes.io/projected/db7fa1b4-c7fb-4b97-85be-4fec151375fa-kube-api-access-4479x\") pod \"horizon-cd5d6d777-529hb\" (UID: \"db7fa1b4-c7fb-4b97-85be-4fec151375fa\") " pod="openstack/horizon-cd5d6d777-529hb" Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 10:33:05.651384 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/db7fa1b4-c7fb-4b97-85be-4fec151375fa-scripts\") pod \"horizon-cd5d6d777-529hb\" (UID: \"db7fa1b4-c7fb-4b97-85be-4fec151375fa\") " pod="openstack/horizon-cd5d6d777-529hb" Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 10:33:05.651400 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db7fa1b4-c7fb-4b97-85be-4fec151375fa-logs\") pod \"horizon-cd5d6d777-529hb\" (UID: \"db7fa1b4-c7fb-4b97-85be-4fec151375fa\") " pod="openstack/horizon-cd5d6d777-529hb" Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 10:33:05.651447 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/db7fa1b4-c7fb-4b97-85be-4fec151375fa-config-data\") pod \"horizon-cd5d6d777-529hb\" (UID: \"db7fa1b4-c7fb-4b97-85be-4fec151375fa\") " pod="openstack/horizon-cd5d6d777-529hb" Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 10:33:05.652188 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/db7fa1b4-c7fb-4b97-85be-4fec151375fa-scripts\") pod \"horizon-cd5d6d777-529hb\" (UID: \"db7fa1b4-c7fb-4b97-85be-4fec151375fa\") " pod="openstack/horizon-cd5d6d777-529hb" Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 10:33:05.652554 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db7fa1b4-c7fb-4b97-85be-4fec151375fa-logs\") pod \"horizon-cd5d6d777-529hb\" (UID: \"db7fa1b4-c7fb-4b97-85be-4fec151375fa\") " pod="openstack/horizon-cd5d6d777-529hb" Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 10:33:05.652946 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/db7fa1b4-c7fb-4b97-85be-4fec151375fa-config-data\") pod \"horizon-cd5d6d777-529hb\" (UID: 
\"db7fa1b4-c7fb-4b97-85be-4fec151375fa\") " pod="openstack/horizon-cd5d6d777-529hb" Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 10:33:05.672300 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/db7fa1b4-c7fb-4b97-85be-4fec151375fa-horizon-secret-key\") pod \"horizon-cd5d6d777-529hb\" (UID: \"db7fa1b4-c7fb-4b97-85be-4fec151375fa\") " pod="openstack/horizon-cd5d6d777-529hb" Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 10:33:05.673263 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4479x\" (UniqueName: \"kubernetes.io/projected/db7fa1b4-c7fb-4b97-85be-4fec151375fa-kube-api-access-4479x\") pod \"horizon-cd5d6d777-529hb\" (UID: \"db7fa1b4-c7fb-4b97-85be-4fec151375fa\") " pod="openstack/horizon-cd5d6d777-529hb" Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 10:33:05.788353 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-cd5d6d777-529hb" Oct 01 10:33:05 crc kubenswrapper[4735]: I1001 10:33:05.909744 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bb0c796-be9f-424d-b9a0-48d6d6dd999b" path="/var/lib/kubelet/pods/8bb0c796-be9f-424d-b9a0-48d6d6dd999b/volumes" Oct 01 10:33:06 crc kubenswrapper[4735]: I1001 10:33:06.827611 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-4552-account-create-25f9d"] Oct 01 10:33:06 crc kubenswrapper[4735]: I1001 10:33:06.829001 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-4552-account-create-25f9d" Oct 01 10:33:06 crc kubenswrapper[4735]: I1001 10:33:06.831268 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 01 10:33:06 crc kubenswrapper[4735]: I1001 10:33:06.835411 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-4552-account-create-25f9d"] Oct 01 10:33:06 crc kubenswrapper[4735]: I1001 10:33:06.870487 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp5wg\" (UniqueName: \"kubernetes.io/projected/314d523b-a835-4be5-b964-108fbc40db3a-kube-api-access-hp5wg\") pod \"barbican-4552-account-create-25f9d\" (UID: \"314d523b-a835-4be5-b964-108fbc40db3a\") " pod="openstack/barbican-4552-account-create-25f9d" Oct 01 10:33:06 crc kubenswrapper[4735]: I1001 10:33:06.924659 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-d6e0-account-create-2pwpg"] Oct 01 10:33:06 crc kubenswrapper[4735]: I1001 10:33:06.927387 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-d6e0-account-create-2pwpg" Oct 01 10:33:06 crc kubenswrapper[4735]: I1001 10:33:06.931717 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 01 10:33:06 crc kubenswrapper[4735]: I1001 10:33:06.937958 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-d6e0-account-create-2pwpg"] Oct 01 10:33:06 crc kubenswrapper[4735]: I1001 10:33:06.972456 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp5wg\" (UniqueName: \"kubernetes.io/projected/314d523b-a835-4be5-b964-108fbc40db3a-kube-api-access-hp5wg\") pod \"barbican-4552-account-create-25f9d\" (UID: \"314d523b-a835-4be5-b964-108fbc40db3a\") " pod="openstack/barbican-4552-account-create-25f9d" Oct 01 10:33:06 crc kubenswrapper[4735]: I1001 10:33:06.972567 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vftj\" (UniqueName: \"kubernetes.io/projected/c97b49f0-ca3a-4433-b3c8-549fef88bfc1-kube-api-access-4vftj\") pod \"cinder-d6e0-account-create-2pwpg\" (UID: \"c97b49f0-ca3a-4433-b3c8-549fef88bfc1\") " pod="openstack/cinder-d6e0-account-create-2pwpg" Oct 01 10:33:06 crc kubenswrapper[4735]: I1001 10:33:06.990633 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp5wg\" (UniqueName: \"kubernetes.io/projected/314d523b-a835-4be5-b964-108fbc40db3a-kube-api-access-hp5wg\") pod \"barbican-4552-account-create-25f9d\" (UID: \"314d523b-a835-4be5-b964-108fbc40db3a\") " pod="openstack/barbican-4552-account-create-25f9d" Oct 01 10:33:07 crc kubenswrapper[4735]: I1001 10:33:07.074732 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vftj\" (UniqueName: \"kubernetes.io/projected/c97b49f0-ca3a-4433-b3c8-549fef88bfc1-kube-api-access-4vftj\") pod \"cinder-d6e0-account-create-2pwpg\" (UID: 
\"c97b49f0-ca3a-4433-b3c8-549fef88bfc1\") " pod="openstack/cinder-d6e0-account-create-2pwpg" Oct 01 10:33:07 crc kubenswrapper[4735]: I1001 10:33:07.090312 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vftj\" (UniqueName: \"kubernetes.io/projected/c97b49f0-ca3a-4433-b3c8-549fef88bfc1-kube-api-access-4vftj\") pod \"cinder-d6e0-account-create-2pwpg\" (UID: \"c97b49f0-ca3a-4433-b3c8-549fef88bfc1\") " pod="openstack/cinder-d6e0-account-create-2pwpg" Oct 01 10:33:07 crc kubenswrapper[4735]: I1001 10:33:07.154885 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4552-account-create-25f9d" Oct 01 10:33:07 crc kubenswrapper[4735]: I1001 10:33:07.231095 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-e0db-account-create-d2r8g"] Oct 01 10:33:07 crc kubenswrapper[4735]: I1001 10:33:07.233265 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e0db-account-create-d2r8g" Oct 01 10:33:07 crc kubenswrapper[4735]: I1001 10:33:07.235567 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 01 10:33:07 crc kubenswrapper[4735]: I1001 10:33:07.240336 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-e0db-account-create-d2r8g"] Oct 01 10:33:07 crc kubenswrapper[4735]: I1001 10:33:07.253107 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-d6e0-account-create-2pwpg" Oct 01 10:33:07 crc kubenswrapper[4735]: I1001 10:33:07.269097 4735 generic.go:334] "Generic (PLEG): container finished" podID="fe773cb9-e1ff-4413-872e-bb7ae002c86b" containerID="d05439fcc255e8c49ca48f154e27cb80a2bfc88aaf5d787fc969ae1583ece2f9" exitCode=0 Oct 01 10:33:07 crc kubenswrapper[4735]: I1001 10:33:07.269284 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2dbb714a-be88-41cd-aa13-93fed3c7a417" containerName="glance-log" containerID="cri-o://d7b58d5fe1d875294360d37e5f3cf58ff9542d4d628c5212f11d7ae6aac558d5" gracePeriod=30 Oct 01 10:33:07 crc kubenswrapper[4735]: I1001 10:33:07.269559 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tq997" event={"ID":"fe773cb9-e1ff-4413-872e-bb7ae002c86b","Type":"ContainerDied","Data":"d05439fcc255e8c49ca48f154e27cb80a2bfc88aaf5d787fc969ae1583ece2f9"} Oct 01 10:33:07 crc kubenswrapper[4735]: I1001 10:33:07.270162 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b74e1979-8744-4246-96b0-13b3fbbff698" containerName="glance-log" containerID="cri-o://e9f5c7f24585fa8ee86eab3944a986db14a7d18437e2e38e048271bf0509211a" gracePeriod=30 Oct 01 10:33:07 crc kubenswrapper[4735]: I1001 10:33:07.270221 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b74e1979-8744-4246-96b0-13b3fbbff698" containerName="glance-httpd" containerID="cri-o://28aa39512cbc18af38aa34dd5f4305a56bffa436d2d1504d64b86e11da46e0c7" gracePeriod=30 Oct 01 10:33:07 crc kubenswrapper[4735]: I1001 10:33:07.269673 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2dbb714a-be88-41cd-aa13-93fed3c7a417" containerName="glance-httpd" 
containerID="cri-o://13c0d553f71e35c29e58aad707db5ba279925752f8b3f13cf13508d9f91a338d" gracePeriod=30 Oct 01 10:33:07 crc kubenswrapper[4735]: I1001 10:33:07.282533 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qss77\" (UniqueName: \"kubernetes.io/projected/19f45b80-8249-470b-b45c-ecca38477609-kube-api-access-qss77\") pod \"neutron-e0db-account-create-d2r8g\" (UID: \"19f45b80-8249-470b-b45c-ecca38477609\") " pod="openstack/neutron-e0db-account-create-d2r8g" Oct 01 10:33:07 crc kubenswrapper[4735]: I1001 10:33:07.384085 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qss77\" (UniqueName: \"kubernetes.io/projected/19f45b80-8249-470b-b45c-ecca38477609-kube-api-access-qss77\") pod \"neutron-e0db-account-create-d2r8g\" (UID: \"19f45b80-8249-470b-b45c-ecca38477609\") " pod="openstack/neutron-e0db-account-create-d2r8g" Oct 01 10:33:07 crc kubenswrapper[4735]: I1001 10:33:07.399641 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qss77\" (UniqueName: \"kubernetes.io/projected/19f45b80-8249-470b-b45c-ecca38477609-kube-api-access-qss77\") pod \"neutron-e0db-account-create-d2r8g\" (UID: \"19f45b80-8249-470b-b45c-ecca38477609\") " pod="openstack/neutron-e0db-account-create-d2r8g" Oct 01 10:33:07 crc kubenswrapper[4735]: I1001 10:33:07.554281 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-e0db-account-create-d2r8g" Oct 01 10:33:08 crc kubenswrapper[4735]: I1001 10:33:08.279350 4735 generic.go:334] "Generic (PLEG): container finished" podID="2dbb714a-be88-41cd-aa13-93fed3c7a417" containerID="13c0d553f71e35c29e58aad707db5ba279925752f8b3f13cf13508d9f91a338d" exitCode=0 Oct 01 10:33:08 crc kubenswrapper[4735]: I1001 10:33:08.279382 4735 generic.go:334] "Generic (PLEG): container finished" podID="2dbb714a-be88-41cd-aa13-93fed3c7a417" containerID="d7b58d5fe1d875294360d37e5f3cf58ff9542d4d628c5212f11d7ae6aac558d5" exitCode=143 Oct 01 10:33:08 crc kubenswrapper[4735]: I1001 10:33:08.279431 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2dbb714a-be88-41cd-aa13-93fed3c7a417","Type":"ContainerDied","Data":"13c0d553f71e35c29e58aad707db5ba279925752f8b3f13cf13508d9f91a338d"} Oct 01 10:33:08 crc kubenswrapper[4735]: I1001 10:33:08.279478 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2dbb714a-be88-41cd-aa13-93fed3c7a417","Type":"ContainerDied","Data":"d7b58d5fe1d875294360d37e5f3cf58ff9542d4d628c5212f11d7ae6aac558d5"} Oct 01 10:33:08 crc kubenswrapper[4735]: I1001 10:33:08.282019 4735 generic.go:334] "Generic (PLEG): container finished" podID="b74e1979-8744-4246-96b0-13b3fbbff698" containerID="28aa39512cbc18af38aa34dd5f4305a56bffa436d2d1504d64b86e11da46e0c7" exitCode=0 Oct 01 10:33:08 crc kubenswrapper[4735]: I1001 10:33:08.282049 4735 generic.go:334] "Generic (PLEG): container finished" podID="b74e1979-8744-4246-96b0-13b3fbbff698" containerID="e9f5c7f24585fa8ee86eab3944a986db14a7d18437e2e38e048271bf0509211a" exitCode=143 Oct 01 10:33:08 crc kubenswrapper[4735]: I1001 10:33:08.282096 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"b74e1979-8744-4246-96b0-13b3fbbff698","Type":"ContainerDied","Data":"28aa39512cbc18af38aa34dd5f4305a56bffa436d2d1504d64b86e11da46e0c7"} Oct 01 10:33:08 crc kubenswrapper[4735]: I1001 10:33:08.282120 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b74e1979-8744-4246-96b0-13b3fbbff698","Type":"ContainerDied","Data":"e9f5c7f24585fa8ee86eab3944a986db14a7d18437e2e38e048271bf0509211a"} Oct 01 10:33:11 crc kubenswrapper[4735]: I1001 10:33:11.696345 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tq997" Oct 01 10:33:11 crc kubenswrapper[4735]: I1001 10:33:11.785688 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe773cb9-e1ff-4413-872e-bb7ae002c86b-fernet-keys\") pod \"fe773cb9-e1ff-4413-872e-bb7ae002c86b\" (UID: \"fe773cb9-e1ff-4413-872e-bb7ae002c86b\") " Oct 01 10:33:11 crc kubenswrapper[4735]: I1001 10:33:11.785769 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6xdp\" (UniqueName: \"kubernetes.io/projected/fe773cb9-e1ff-4413-872e-bb7ae002c86b-kube-api-access-l6xdp\") pod \"fe773cb9-e1ff-4413-872e-bb7ae002c86b\" (UID: \"fe773cb9-e1ff-4413-872e-bb7ae002c86b\") " Oct 01 10:33:11 crc kubenswrapper[4735]: I1001 10:33:11.785841 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe773cb9-e1ff-4413-872e-bb7ae002c86b-config-data\") pod \"fe773cb9-e1ff-4413-872e-bb7ae002c86b\" (UID: \"fe773cb9-e1ff-4413-872e-bb7ae002c86b\") " Oct 01 10:33:11 crc kubenswrapper[4735]: I1001 10:33:11.785880 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe773cb9-e1ff-4413-872e-bb7ae002c86b-scripts\") pod \"fe773cb9-e1ff-4413-872e-bb7ae002c86b\" (UID: 
\"fe773cb9-e1ff-4413-872e-bb7ae002c86b\") " Oct 01 10:33:11 crc kubenswrapper[4735]: I1001 10:33:11.785931 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe773cb9-e1ff-4413-872e-bb7ae002c86b-combined-ca-bundle\") pod \"fe773cb9-e1ff-4413-872e-bb7ae002c86b\" (UID: \"fe773cb9-e1ff-4413-872e-bb7ae002c86b\") " Oct 01 10:33:11 crc kubenswrapper[4735]: I1001 10:33:11.786024 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe773cb9-e1ff-4413-872e-bb7ae002c86b-credential-keys\") pod \"fe773cb9-e1ff-4413-872e-bb7ae002c86b\" (UID: \"fe773cb9-e1ff-4413-872e-bb7ae002c86b\") " Oct 01 10:33:11 crc kubenswrapper[4735]: I1001 10:33:11.792546 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe773cb9-e1ff-4413-872e-bb7ae002c86b-scripts" (OuterVolumeSpecName: "scripts") pod "fe773cb9-e1ff-4413-872e-bb7ae002c86b" (UID: "fe773cb9-e1ff-4413-872e-bb7ae002c86b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:33:11 crc kubenswrapper[4735]: I1001 10:33:11.792979 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe773cb9-e1ff-4413-872e-bb7ae002c86b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "fe773cb9-e1ff-4413-872e-bb7ae002c86b" (UID: "fe773cb9-e1ff-4413-872e-bb7ae002c86b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:33:11 crc kubenswrapper[4735]: I1001 10:33:11.793873 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe773cb9-e1ff-4413-872e-bb7ae002c86b-kube-api-access-l6xdp" (OuterVolumeSpecName: "kube-api-access-l6xdp") pod "fe773cb9-e1ff-4413-872e-bb7ae002c86b" (UID: "fe773cb9-e1ff-4413-872e-bb7ae002c86b"). 
InnerVolumeSpecName "kube-api-access-l6xdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:33:11 crc kubenswrapper[4735]: I1001 10:33:11.795982 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe773cb9-e1ff-4413-872e-bb7ae002c86b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "fe773cb9-e1ff-4413-872e-bb7ae002c86b" (UID: "fe773cb9-e1ff-4413-872e-bb7ae002c86b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:33:11 crc kubenswrapper[4735]: I1001 10:33:11.816936 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe773cb9-e1ff-4413-872e-bb7ae002c86b-config-data" (OuterVolumeSpecName: "config-data") pod "fe773cb9-e1ff-4413-872e-bb7ae002c86b" (UID: "fe773cb9-e1ff-4413-872e-bb7ae002c86b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:33:11 crc kubenswrapper[4735]: I1001 10:33:11.822123 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe773cb9-e1ff-4413-872e-bb7ae002c86b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe773cb9-e1ff-4413-872e-bb7ae002c86b" (UID: "fe773cb9-e1ff-4413-872e-bb7ae002c86b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:33:11 crc kubenswrapper[4735]: I1001 10:33:11.888062 4735 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe773cb9-e1ff-4413-872e-bb7ae002c86b-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:11 crc kubenswrapper[4735]: I1001 10:33:11.888097 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6xdp\" (UniqueName: \"kubernetes.io/projected/fe773cb9-e1ff-4413-872e-bb7ae002c86b-kube-api-access-l6xdp\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:11 crc kubenswrapper[4735]: I1001 10:33:11.888112 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe773cb9-e1ff-4413-872e-bb7ae002c86b-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:11 crc kubenswrapper[4735]: I1001 10:33:11.888123 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe773cb9-e1ff-4413-872e-bb7ae002c86b-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:11 crc kubenswrapper[4735]: I1001 10:33:11.888133 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe773cb9-e1ff-4413-872e-bb7ae002c86b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:11 crc kubenswrapper[4735]: I1001 10:33:11.888144 4735 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe773cb9-e1ff-4413-872e-bb7ae002c86b-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.195566 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c88c854b9-2q2b2"] Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.198626 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8b5c85b87-xjfvr" Oct 01 
10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.216656 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7bcff764fb-c7nmm"] Oct 01 10:33:12 crc kubenswrapper[4735]: E1001 10:33:12.217018 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe773cb9-e1ff-4413-872e-bb7ae002c86b" containerName="keystone-bootstrap" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.217033 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe773cb9-e1ff-4413-872e-bb7ae002c86b" containerName="keystone-bootstrap" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.217193 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe773cb9-e1ff-4413-872e-bb7ae002c86b" containerName="keystone-bootstrap" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.221466 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7bcff764fb-c7nmm" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.227773 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.228584 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7bcff764fb-c7nmm"] Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.297306 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/45ed423f-4895-4df3-9a04-2b916f38f57d-config-data\") pod \"horizon-7bcff764fb-c7nmm\" (UID: \"45ed423f-4895-4df3-9a04-2b916f38f57d\") " pod="openstack/horizon-7bcff764fb-c7nmm" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.297349 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/45ed423f-4895-4df3-9a04-2b916f38f57d-horizon-tls-certs\") pod \"horizon-7bcff764fb-c7nmm\" (UID: 
\"45ed423f-4895-4df3-9a04-2b916f38f57d\") " pod="openstack/horizon-7bcff764fb-c7nmm" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.297410 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcl52\" (UniqueName: \"kubernetes.io/projected/45ed423f-4895-4df3-9a04-2b916f38f57d-kube-api-access-fcl52\") pod \"horizon-7bcff764fb-c7nmm\" (UID: \"45ed423f-4895-4df3-9a04-2b916f38f57d\") " pod="openstack/horizon-7bcff764fb-c7nmm" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.297442 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/45ed423f-4895-4df3-9a04-2b916f38f57d-horizon-secret-key\") pod \"horizon-7bcff764fb-c7nmm\" (UID: \"45ed423f-4895-4df3-9a04-2b916f38f57d\") " pod="openstack/horizon-7bcff764fb-c7nmm" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.297527 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45ed423f-4895-4df3-9a04-2b916f38f57d-scripts\") pod \"horizon-7bcff764fb-c7nmm\" (UID: \"45ed423f-4895-4df3-9a04-2b916f38f57d\") " pod="openstack/horizon-7bcff764fb-c7nmm" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.297561 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45ed423f-4895-4df3-9a04-2b916f38f57d-combined-ca-bundle\") pod \"horizon-7bcff764fb-c7nmm\" (UID: \"45ed423f-4895-4df3-9a04-2b916f38f57d\") " pod="openstack/horizon-7bcff764fb-c7nmm" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.297616 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45ed423f-4895-4df3-9a04-2b916f38f57d-logs\") pod \"horizon-7bcff764fb-c7nmm\" (UID: 
\"45ed423f-4895-4df3-9a04-2b916f38f57d\") " pod="openstack/horizon-7bcff764fb-c7nmm" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.303781 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-szpsz"] Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.304036 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77585f5f8c-szpsz" podUID="2aebb4f3-5596-43e8-bbc4-dfb874bccd88" containerName="dnsmasq-dns" containerID="cri-o://3c36bda10c0ef79e37afdadeff00655bd2951d835224a091e47cd6984653e325" gracePeriod=10 Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.330340 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-cd5d6d777-529hb"] Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.349829 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tq997" event={"ID":"fe773cb9-e1ff-4413-872e-bb7ae002c86b","Type":"ContainerDied","Data":"5addcafcfcd24c8ce81fb98f3fc4ac154fb01b7c092574cdca3a95469c988e58"} Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.350104 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5addcafcfcd24c8ce81fb98f3fc4ac154fb01b7c092574cdca3a95469c988e58" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.350177 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tq997" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.379293 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7746dbdbf6-t6f7n"] Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.380975 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7746dbdbf6-t6f7n" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.388108 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7746dbdbf6-t6f7n"] Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.403246 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/45ed423f-4895-4df3-9a04-2b916f38f57d-config-data\") pod \"horizon-7bcff764fb-c7nmm\" (UID: \"45ed423f-4895-4df3-9a04-2b916f38f57d\") " pod="openstack/horizon-7bcff764fb-c7nmm" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.403291 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/45ed423f-4895-4df3-9a04-2b916f38f57d-horizon-tls-certs\") pod \"horizon-7bcff764fb-c7nmm\" (UID: \"45ed423f-4895-4df3-9a04-2b916f38f57d\") " pod="openstack/horizon-7bcff764fb-c7nmm" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.403349 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcl52\" (UniqueName: \"kubernetes.io/projected/45ed423f-4895-4df3-9a04-2b916f38f57d-kube-api-access-fcl52\") pod \"horizon-7bcff764fb-c7nmm\" (UID: \"45ed423f-4895-4df3-9a04-2b916f38f57d\") " pod="openstack/horizon-7bcff764fb-c7nmm" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.403395 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/45ed423f-4895-4df3-9a04-2b916f38f57d-horizon-secret-key\") pod \"horizon-7bcff764fb-c7nmm\" (UID: \"45ed423f-4895-4df3-9a04-2b916f38f57d\") " pod="openstack/horizon-7bcff764fb-c7nmm" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.403535 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/45ed423f-4895-4df3-9a04-2b916f38f57d-scripts\") pod \"horizon-7bcff764fb-c7nmm\" (UID: \"45ed423f-4895-4df3-9a04-2b916f38f57d\") " pod="openstack/horizon-7bcff764fb-c7nmm" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.403570 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45ed423f-4895-4df3-9a04-2b916f38f57d-combined-ca-bundle\") pod \"horizon-7bcff764fb-c7nmm\" (UID: \"45ed423f-4895-4df3-9a04-2b916f38f57d\") " pod="openstack/horizon-7bcff764fb-c7nmm" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.403634 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45ed423f-4895-4df3-9a04-2b916f38f57d-logs\") pod \"horizon-7bcff764fb-c7nmm\" (UID: \"45ed423f-4895-4df3-9a04-2b916f38f57d\") " pod="openstack/horizon-7bcff764fb-c7nmm" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.404025 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45ed423f-4895-4df3-9a04-2b916f38f57d-logs\") pod \"horizon-7bcff764fb-c7nmm\" (UID: \"45ed423f-4895-4df3-9a04-2b916f38f57d\") " pod="openstack/horizon-7bcff764fb-c7nmm" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.408446 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45ed423f-4895-4df3-9a04-2b916f38f57d-scripts\") pod \"horizon-7bcff764fb-c7nmm\" (UID: \"45ed423f-4895-4df3-9a04-2b916f38f57d\") " pod="openstack/horizon-7bcff764fb-c7nmm" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.408715 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/45ed423f-4895-4df3-9a04-2b916f38f57d-horizon-tls-certs\") pod \"horizon-7bcff764fb-c7nmm\" (UID: \"45ed423f-4895-4df3-9a04-2b916f38f57d\") " 
pod="openstack/horizon-7bcff764fb-c7nmm" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.410642 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/45ed423f-4895-4df3-9a04-2b916f38f57d-config-data\") pod \"horizon-7bcff764fb-c7nmm\" (UID: \"45ed423f-4895-4df3-9a04-2b916f38f57d\") " pod="openstack/horizon-7bcff764fb-c7nmm" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.411454 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/45ed423f-4895-4df3-9a04-2b916f38f57d-horizon-secret-key\") pod \"horizon-7bcff764fb-c7nmm\" (UID: \"45ed423f-4895-4df3-9a04-2b916f38f57d\") " pod="openstack/horizon-7bcff764fb-c7nmm" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.414058 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45ed423f-4895-4df3-9a04-2b916f38f57d-combined-ca-bundle\") pod \"horizon-7bcff764fb-c7nmm\" (UID: \"45ed423f-4895-4df3-9a04-2b916f38f57d\") " pod="openstack/horizon-7bcff764fb-c7nmm" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.424105 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcl52\" (UniqueName: \"kubernetes.io/projected/45ed423f-4895-4df3-9a04-2b916f38f57d-kube-api-access-fcl52\") pod \"horizon-7bcff764fb-c7nmm\" (UID: \"45ed423f-4895-4df3-9a04-2b916f38f57d\") " pod="openstack/horizon-7bcff764fb-c7nmm" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.506003 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7353c4ca-59bc-4a50-8840-8365f90f6384-logs\") pod \"horizon-7746dbdbf6-t6f7n\" (UID: \"7353c4ca-59bc-4a50-8840-8365f90f6384\") " pod="openstack/horizon-7746dbdbf6-t6f7n" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.506077 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7353c4ca-59bc-4a50-8840-8365f90f6384-horizon-secret-key\") pod \"horizon-7746dbdbf6-t6f7n\" (UID: \"7353c4ca-59bc-4a50-8840-8365f90f6384\") " pod="openstack/horizon-7746dbdbf6-t6f7n" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.506121 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7353c4ca-59bc-4a50-8840-8365f90f6384-horizon-tls-certs\") pod \"horizon-7746dbdbf6-t6f7n\" (UID: \"7353c4ca-59bc-4a50-8840-8365f90f6384\") " pod="openstack/horizon-7746dbdbf6-t6f7n" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.506259 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xfqh\" (UniqueName: \"kubernetes.io/projected/7353c4ca-59bc-4a50-8840-8365f90f6384-kube-api-access-6xfqh\") pod \"horizon-7746dbdbf6-t6f7n\" (UID: \"7353c4ca-59bc-4a50-8840-8365f90f6384\") " pod="openstack/horizon-7746dbdbf6-t6f7n" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.506411 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7353c4ca-59bc-4a50-8840-8365f90f6384-combined-ca-bundle\") pod \"horizon-7746dbdbf6-t6f7n\" (UID: \"7353c4ca-59bc-4a50-8840-8365f90f6384\") " pod="openstack/horizon-7746dbdbf6-t6f7n" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.506457 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7353c4ca-59bc-4a50-8840-8365f90f6384-config-data\") pod \"horizon-7746dbdbf6-t6f7n\" (UID: \"7353c4ca-59bc-4a50-8840-8365f90f6384\") " pod="openstack/horizon-7746dbdbf6-t6f7n" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 
10:33:12.506556 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7353c4ca-59bc-4a50-8840-8365f90f6384-scripts\") pod \"horizon-7746dbdbf6-t6f7n\" (UID: \"7353c4ca-59bc-4a50-8840-8365f90f6384\") " pod="openstack/horizon-7746dbdbf6-t6f7n" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.543596 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7bcff764fb-c7nmm" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.611314 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7353c4ca-59bc-4a50-8840-8365f90f6384-horizon-tls-certs\") pod \"horizon-7746dbdbf6-t6f7n\" (UID: \"7353c4ca-59bc-4a50-8840-8365f90f6384\") " pod="openstack/horizon-7746dbdbf6-t6f7n" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.611375 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xfqh\" (UniqueName: \"kubernetes.io/projected/7353c4ca-59bc-4a50-8840-8365f90f6384-kube-api-access-6xfqh\") pod \"horizon-7746dbdbf6-t6f7n\" (UID: \"7353c4ca-59bc-4a50-8840-8365f90f6384\") " pod="openstack/horizon-7746dbdbf6-t6f7n" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.611438 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7353c4ca-59bc-4a50-8840-8365f90f6384-combined-ca-bundle\") pod \"horizon-7746dbdbf6-t6f7n\" (UID: \"7353c4ca-59bc-4a50-8840-8365f90f6384\") " pod="openstack/horizon-7746dbdbf6-t6f7n" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.611460 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7353c4ca-59bc-4a50-8840-8365f90f6384-config-data\") pod \"horizon-7746dbdbf6-t6f7n\" (UID: 
\"7353c4ca-59bc-4a50-8840-8365f90f6384\") " pod="openstack/horizon-7746dbdbf6-t6f7n" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.611488 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7353c4ca-59bc-4a50-8840-8365f90f6384-scripts\") pod \"horizon-7746dbdbf6-t6f7n\" (UID: \"7353c4ca-59bc-4a50-8840-8365f90f6384\") " pod="openstack/horizon-7746dbdbf6-t6f7n" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.611540 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7353c4ca-59bc-4a50-8840-8365f90f6384-logs\") pod \"horizon-7746dbdbf6-t6f7n\" (UID: \"7353c4ca-59bc-4a50-8840-8365f90f6384\") " pod="openstack/horizon-7746dbdbf6-t6f7n" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.611559 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7353c4ca-59bc-4a50-8840-8365f90f6384-horizon-secret-key\") pod \"horizon-7746dbdbf6-t6f7n\" (UID: \"7353c4ca-59bc-4a50-8840-8365f90f6384\") " pod="openstack/horizon-7746dbdbf6-t6f7n" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.615178 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7353c4ca-59bc-4a50-8840-8365f90f6384-logs\") pod \"horizon-7746dbdbf6-t6f7n\" (UID: \"7353c4ca-59bc-4a50-8840-8365f90f6384\") " pod="openstack/horizon-7746dbdbf6-t6f7n" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.615202 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7353c4ca-59bc-4a50-8840-8365f90f6384-scripts\") pod \"horizon-7746dbdbf6-t6f7n\" (UID: \"7353c4ca-59bc-4a50-8840-8365f90f6384\") " pod="openstack/horizon-7746dbdbf6-t6f7n" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.615916 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7353c4ca-59bc-4a50-8840-8365f90f6384-config-data\") pod \"horizon-7746dbdbf6-t6f7n\" (UID: \"7353c4ca-59bc-4a50-8840-8365f90f6384\") " pod="openstack/horizon-7746dbdbf6-t6f7n" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.634779 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xfqh\" (UniqueName: \"kubernetes.io/projected/7353c4ca-59bc-4a50-8840-8365f90f6384-kube-api-access-6xfqh\") pod \"horizon-7746dbdbf6-t6f7n\" (UID: \"7353c4ca-59bc-4a50-8840-8365f90f6384\") " pod="openstack/horizon-7746dbdbf6-t6f7n" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.638694 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7353c4ca-59bc-4a50-8840-8365f90f6384-horizon-secret-key\") pod \"horizon-7746dbdbf6-t6f7n\" (UID: \"7353c4ca-59bc-4a50-8840-8365f90f6384\") " pod="openstack/horizon-7746dbdbf6-t6f7n" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.638852 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7353c4ca-59bc-4a50-8840-8365f90f6384-combined-ca-bundle\") pod \"horizon-7746dbdbf6-t6f7n\" (UID: \"7353c4ca-59bc-4a50-8840-8365f90f6384\") " pod="openstack/horizon-7746dbdbf6-t6f7n" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.642514 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7353c4ca-59bc-4a50-8840-8365f90f6384-horizon-tls-certs\") pod \"horizon-7746dbdbf6-t6f7n\" (UID: \"7353c4ca-59bc-4a50-8840-8365f90f6384\") " pod="openstack/horizon-7746dbdbf6-t6f7n" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.707897 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7746dbdbf6-t6f7n" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.782811 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-tq997"] Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.788860 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-tq997"] Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.882616 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-xqbbf"] Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.883789 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xqbbf" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.885532 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.887325 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pn2sw" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.887759 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.887808 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 01 10:33:12 crc kubenswrapper[4735]: I1001 10:33:12.894475 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xqbbf"] Oct 01 10:33:13 crc kubenswrapper[4735]: I1001 10:33:13.017754 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6x8t\" (UniqueName: \"kubernetes.io/projected/c176cb22-cf84-4a55-aac6-6fff6792ea56-kube-api-access-h6x8t\") pod \"keystone-bootstrap-xqbbf\" (UID: \"c176cb22-cf84-4a55-aac6-6fff6792ea56\") " pod="openstack/keystone-bootstrap-xqbbf" Oct 01 10:33:13 crc kubenswrapper[4735]: 
I1001 10:33:13.017860 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c176cb22-cf84-4a55-aac6-6fff6792ea56-fernet-keys\") pod \"keystone-bootstrap-xqbbf\" (UID: \"c176cb22-cf84-4a55-aac6-6fff6792ea56\") " pod="openstack/keystone-bootstrap-xqbbf" Oct 01 10:33:13 crc kubenswrapper[4735]: I1001 10:33:13.017904 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c176cb22-cf84-4a55-aac6-6fff6792ea56-combined-ca-bundle\") pod \"keystone-bootstrap-xqbbf\" (UID: \"c176cb22-cf84-4a55-aac6-6fff6792ea56\") " pod="openstack/keystone-bootstrap-xqbbf" Oct 01 10:33:13 crc kubenswrapper[4735]: I1001 10:33:13.017944 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c176cb22-cf84-4a55-aac6-6fff6792ea56-config-data\") pod \"keystone-bootstrap-xqbbf\" (UID: \"c176cb22-cf84-4a55-aac6-6fff6792ea56\") " pod="openstack/keystone-bootstrap-xqbbf" Oct 01 10:33:13 crc kubenswrapper[4735]: I1001 10:33:13.017968 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c176cb22-cf84-4a55-aac6-6fff6792ea56-credential-keys\") pod \"keystone-bootstrap-xqbbf\" (UID: \"c176cb22-cf84-4a55-aac6-6fff6792ea56\") " pod="openstack/keystone-bootstrap-xqbbf" Oct 01 10:33:13 crc kubenswrapper[4735]: I1001 10:33:13.018005 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c176cb22-cf84-4a55-aac6-6fff6792ea56-scripts\") pod \"keystone-bootstrap-xqbbf\" (UID: \"c176cb22-cf84-4a55-aac6-6fff6792ea56\") " pod="openstack/keystone-bootstrap-xqbbf" Oct 01 10:33:13 crc kubenswrapper[4735]: I1001 10:33:13.120670 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c176cb22-cf84-4a55-aac6-6fff6792ea56-scripts\") pod \"keystone-bootstrap-xqbbf\" (UID: \"c176cb22-cf84-4a55-aac6-6fff6792ea56\") " pod="openstack/keystone-bootstrap-xqbbf" Oct 01 10:33:13 crc kubenswrapper[4735]: I1001 10:33:13.121030 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6x8t\" (UniqueName: \"kubernetes.io/projected/c176cb22-cf84-4a55-aac6-6fff6792ea56-kube-api-access-h6x8t\") pod \"keystone-bootstrap-xqbbf\" (UID: \"c176cb22-cf84-4a55-aac6-6fff6792ea56\") " pod="openstack/keystone-bootstrap-xqbbf" Oct 01 10:33:13 crc kubenswrapper[4735]: I1001 10:33:13.121076 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c176cb22-cf84-4a55-aac6-6fff6792ea56-fernet-keys\") pod \"keystone-bootstrap-xqbbf\" (UID: \"c176cb22-cf84-4a55-aac6-6fff6792ea56\") " pod="openstack/keystone-bootstrap-xqbbf" Oct 01 10:33:13 crc kubenswrapper[4735]: I1001 10:33:13.121108 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c176cb22-cf84-4a55-aac6-6fff6792ea56-combined-ca-bundle\") pod \"keystone-bootstrap-xqbbf\" (UID: \"c176cb22-cf84-4a55-aac6-6fff6792ea56\") " pod="openstack/keystone-bootstrap-xqbbf" Oct 01 10:33:13 crc kubenswrapper[4735]: I1001 10:33:13.121144 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c176cb22-cf84-4a55-aac6-6fff6792ea56-config-data\") pod \"keystone-bootstrap-xqbbf\" (UID: \"c176cb22-cf84-4a55-aac6-6fff6792ea56\") " pod="openstack/keystone-bootstrap-xqbbf" Oct 01 10:33:13 crc kubenswrapper[4735]: I1001 10:33:13.121164 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/c176cb22-cf84-4a55-aac6-6fff6792ea56-credential-keys\") pod \"keystone-bootstrap-xqbbf\" (UID: \"c176cb22-cf84-4a55-aac6-6fff6792ea56\") " pod="openstack/keystone-bootstrap-xqbbf" Oct 01 10:33:13 crc kubenswrapper[4735]: I1001 10:33:13.127800 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c176cb22-cf84-4a55-aac6-6fff6792ea56-credential-keys\") pod \"keystone-bootstrap-xqbbf\" (UID: \"c176cb22-cf84-4a55-aac6-6fff6792ea56\") " pod="openstack/keystone-bootstrap-xqbbf" Oct 01 10:33:13 crc kubenswrapper[4735]: I1001 10:33:13.128104 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c176cb22-cf84-4a55-aac6-6fff6792ea56-combined-ca-bundle\") pod \"keystone-bootstrap-xqbbf\" (UID: \"c176cb22-cf84-4a55-aac6-6fff6792ea56\") " pod="openstack/keystone-bootstrap-xqbbf" Oct 01 10:33:13 crc kubenswrapper[4735]: I1001 10:33:13.128280 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c176cb22-cf84-4a55-aac6-6fff6792ea56-scripts\") pod \"keystone-bootstrap-xqbbf\" (UID: \"c176cb22-cf84-4a55-aac6-6fff6792ea56\") " pod="openstack/keystone-bootstrap-xqbbf" Oct 01 10:33:13 crc kubenswrapper[4735]: I1001 10:33:13.128802 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c176cb22-cf84-4a55-aac6-6fff6792ea56-config-data\") pod \"keystone-bootstrap-xqbbf\" (UID: \"c176cb22-cf84-4a55-aac6-6fff6792ea56\") " pod="openstack/keystone-bootstrap-xqbbf" Oct 01 10:33:13 crc kubenswrapper[4735]: I1001 10:33:13.128823 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c176cb22-cf84-4a55-aac6-6fff6792ea56-fernet-keys\") pod \"keystone-bootstrap-xqbbf\" (UID: \"c176cb22-cf84-4a55-aac6-6fff6792ea56\") " 
pod="openstack/keystone-bootstrap-xqbbf" Oct 01 10:33:13 crc kubenswrapper[4735]: I1001 10:33:13.140629 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6x8t\" (UniqueName: \"kubernetes.io/projected/c176cb22-cf84-4a55-aac6-6fff6792ea56-kube-api-access-h6x8t\") pod \"keystone-bootstrap-xqbbf\" (UID: \"c176cb22-cf84-4a55-aac6-6fff6792ea56\") " pod="openstack/keystone-bootstrap-xqbbf" Oct 01 10:33:13 crc kubenswrapper[4735]: I1001 10:33:13.210907 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xqbbf" Oct 01 10:33:13 crc kubenswrapper[4735]: I1001 10:33:13.359715 4735 generic.go:334] "Generic (PLEG): container finished" podID="2aebb4f3-5596-43e8-bbc4-dfb874bccd88" containerID="3c36bda10c0ef79e37afdadeff00655bd2951d835224a091e47cd6984653e325" exitCode=0 Oct 01 10:33:13 crc kubenswrapper[4735]: I1001 10:33:13.359755 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-szpsz" event={"ID":"2aebb4f3-5596-43e8-bbc4-dfb874bccd88","Type":"ContainerDied","Data":"3c36bda10c0ef79e37afdadeff00655bd2951d835224a091e47cd6984653e325"} Oct 01 10:33:13 crc kubenswrapper[4735]: I1001 10:33:13.908041 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe773cb9-e1ff-4413-872e-bb7ae002c86b" path="/var/lib/kubelet/pods/fe773cb9-e1ff-4413-872e-bb7ae002c86b/volumes" Oct 01 10:33:15 crc kubenswrapper[4735]: I1001 10:33:15.520867 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-szpsz" podUID="2aebb4f3-5596-43e8-bbc4-dfb874bccd88" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: connect: connection refused" Oct 01 10:33:16 crc kubenswrapper[4735]: E1001 10:33:16.295689 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Oct 01 10:33:16 crc kubenswrapper[4735]: E1001 10:33:16.296199 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4fxds,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEsca
lation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-t8d7j_openstack(c7cb175b-c967-47c2-96b0-da043b6d3506): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 10:33:16 crc kubenswrapper[4735]: E1001 10:33:16.298037 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-t8d7j" podUID="c7cb175b-c967-47c2-96b0-da043b6d3506" Oct 01 10:33:16 crc kubenswrapper[4735]: I1001 10:33:16.393617 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b74e1979-8744-4246-96b0-13b3fbbff698","Type":"ContainerDied","Data":"8606bc1234e98100a57b5155da12b055e24c73db1061429db17bccdc3dcc531b"} Oct 01 10:33:16 crc kubenswrapper[4735]: I1001 10:33:16.393667 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8606bc1234e98100a57b5155da12b055e24c73db1061429db17bccdc3dcc531b" Oct 01 10:33:16 crc kubenswrapper[4735]: I1001 10:33:16.440835 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 10:33:16 crc kubenswrapper[4735]: I1001 10:33:16.589948 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b74e1979-8744-4246-96b0-13b3fbbff698-scripts\") pod \"b74e1979-8744-4246-96b0-13b3fbbff698\" (UID: \"b74e1979-8744-4246-96b0-13b3fbbff698\") " Oct 01 10:33:16 crc kubenswrapper[4735]: I1001 10:33:16.590057 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b74e1979-8744-4246-96b0-13b3fbbff698-logs\") pod \"b74e1979-8744-4246-96b0-13b3fbbff698\" (UID: \"b74e1979-8744-4246-96b0-13b3fbbff698\") " Oct 01 10:33:16 crc kubenswrapper[4735]: I1001 10:33:16.590149 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tj5m\" (UniqueName: \"kubernetes.io/projected/b74e1979-8744-4246-96b0-13b3fbbff698-kube-api-access-4tj5m\") pod \"b74e1979-8744-4246-96b0-13b3fbbff698\" (UID: \"b74e1979-8744-4246-96b0-13b3fbbff698\") " Oct 01 10:33:16 crc kubenswrapper[4735]: I1001 10:33:16.590235 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b74e1979-8744-4246-96b0-13b3fbbff698-config-data\") pod \"b74e1979-8744-4246-96b0-13b3fbbff698\" (UID: \"b74e1979-8744-4246-96b0-13b3fbbff698\") " Oct 01 10:33:16 crc kubenswrapper[4735]: I1001 10:33:16.590256 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b74e1979-8744-4246-96b0-13b3fbbff698-combined-ca-bundle\") pod \"b74e1979-8744-4246-96b0-13b3fbbff698\" (UID: \"b74e1979-8744-4246-96b0-13b3fbbff698\") " Oct 01 10:33:16 crc kubenswrapper[4735]: I1001 10:33:16.590324 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"b74e1979-8744-4246-96b0-13b3fbbff698\" (UID: \"b74e1979-8744-4246-96b0-13b3fbbff698\") " Oct 01 10:33:16 crc kubenswrapper[4735]: I1001 10:33:16.590391 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b74e1979-8744-4246-96b0-13b3fbbff698-httpd-run\") pod \"b74e1979-8744-4246-96b0-13b3fbbff698\" (UID: \"b74e1979-8744-4246-96b0-13b3fbbff698\") " Oct 01 10:33:16 crc kubenswrapper[4735]: I1001 10:33:16.591201 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b74e1979-8744-4246-96b0-13b3fbbff698-logs" (OuterVolumeSpecName: "logs") pod "b74e1979-8744-4246-96b0-13b3fbbff698" (UID: "b74e1979-8744-4246-96b0-13b3fbbff698"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:33:16 crc kubenswrapper[4735]: I1001 10:33:16.591533 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b74e1979-8744-4246-96b0-13b3fbbff698-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b74e1979-8744-4246-96b0-13b3fbbff698" (UID: "b74e1979-8744-4246-96b0-13b3fbbff698"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:33:16 crc kubenswrapper[4735]: I1001 10:33:16.596213 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b74e1979-8744-4246-96b0-13b3fbbff698-kube-api-access-4tj5m" (OuterVolumeSpecName: "kube-api-access-4tj5m") pod "b74e1979-8744-4246-96b0-13b3fbbff698" (UID: "b74e1979-8744-4246-96b0-13b3fbbff698"). InnerVolumeSpecName "kube-api-access-4tj5m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:33:16 crc kubenswrapper[4735]: I1001 10:33:16.596324 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b74e1979-8744-4246-96b0-13b3fbbff698-scripts" (OuterVolumeSpecName: "scripts") pod "b74e1979-8744-4246-96b0-13b3fbbff698" (UID: "b74e1979-8744-4246-96b0-13b3fbbff698"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:33:16 crc kubenswrapper[4735]: I1001 10:33:16.596378 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "b74e1979-8744-4246-96b0-13b3fbbff698" (UID: "b74e1979-8744-4246-96b0-13b3fbbff698"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 10:33:16 crc kubenswrapper[4735]: E1001 10:33:16.615197 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-t8d7j" podUID="c7cb175b-c967-47c2-96b0-da043b6d3506" Oct 01 10:33:16 crc kubenswrapper[4735]: I1001 10:33:16.623278 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b74e1979-8744-4246-96b0-13b3fbbff698-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b74e1979-8744-4246-96b0-13b3fbbff698" (UID: "b74e1979-8744-4246-96b0-13b3fbbff698"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:33:16 crc kubenswrapper[4735]: I1001 10:33:16.644332 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b74e1979-8744-4246-96b0-13b3fbbff698-config-data" (OuterVolumeSpecName: "config-data") pod "b74e1979-8744-4246-96b0-13b3fbbff698" (UID: "b74e1979-8744-4246-96b0-13b3fbbff698"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:33:16 crc kubenswrapper[4735]: I1001 10:33:16.694614 4735 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 01 10:33:16 crc kubenswrapper[4735]: I1001 10:33:16.694673 4735 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b74e1979-8744-4246-96b0-13b3fbbff698-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:16 crc kubenswrapper[4735]: I1001 10:33:16.694685 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b74e1979-8744-4246-96b0-13b3fbbff698-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:16 crc kubenswrapper[4735]: I1001 10:33:16.694694 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b74e1979-8744-4246-96b0-13b3fbbff698-logs\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:16 crc kubenswrapper[4735]: I1001 10:33:16.694731 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tj5m\" (UniqueName: \"kubernetes.io/projected/b74e1979-8744-4246-96b0-13b3fbbff698-kube-api-access-4tj5m\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:16 crc kubenswrapper[4735]: I1001 10:33:16.694741 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b74e1979-8744-4246-96b0-13b3fbbff698-config-data\") 
on node \"crc\" DevicePath \"\"" Oct 01 10:33:16 crc kubenswrapper[4735]: I1001 10:33:16.694749 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b74e1979-8744-4246-96b0-13b3fbbff698-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:16 crc kubenswrapper[4735]: I1001 10:33:16.734374 4735 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 01 10:33:16 crc kubenswrapper[4735]: I1001 10:33:16.799078 4735 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:16 crc kubenswrapper[4735]: I1001 10:33:16.831581 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-cd5d6d777-529hb"] Oct 01 10:33:16 crc kubenswrapper[4735]: I1001 10:33:16.893334 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 10:33:16 crc kubenswrapper[4735]: I1001 10:33:16.974050 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-szpsz" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.002652 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dbb714a-be88-41cd-aa13-93fed3c7a417-logs\") pod \"2dbb714a-be88-41cd-aa13-93fed3c7a417\" (UID: \"2dbb714a-be88-41cd-aa13-93fed3c7a417\") " Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.002780 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzmv2\" (UniqueName: \"kubernetes.io/projected/2dbb714a-be88-41cd-aa13-93fed3c7a417-kube-api-access-kzmv2\") pod \"2dbb714a-be88-41cd-aa13-93fed3c7a417\" (UID: \"2dbb714a-be88-41cd-aa13-93fed3c7a417\") " Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.002954 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dbb714a-be88-41cd-aa13-93fed3c7a417-config-data\") pod \"2dbb714a-be88-41cd-aa13-93fed3c7a417\" (UID: \"2dbb714a-be88-41cd-aa13-93fed3c7a417\") " Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.003163 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dbb714a-be88-41cd-aa13-93fed3c7a417-logs" (OuterVolumeSpecName: "logs") pod "2dbb714a-be88-41cd-aa13-93fed3c7a417" (UID: "2dbb714a-be88-41cd-aa13-93fed3c7a417"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.004177 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"2dbb714a-be88-41cd-aa13-93fed3c7a417\" (UID: \"2dbb714a-be88-41cd-aa13-93fed3c7a417\") " Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.004283 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dbb714a-be88-41cd-aa13-93fed3c7a417-scripts\") pod \"2dbb714a-be88-41cd-aa13-93fed3c7a417\" (UID: \"2dbb714a-be88-41cd-aa13-93fed3c7a417\") " Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.004324 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2dbb714a-be88-41cd-aa13-93fed3c7a417-httpd-run\") pod \"2dbb714a-be88-41cd-aa13-93fed3c7a417\" (UID: \"2dbb714a-be88-41cd-aa13-93fed3c7a417\") " Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.004372 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dbb714a-be88-41cd-aa13-93fed3c7a417-combined-ca-bundle\") pod \"2dbb714a-be88-41cd-aa13-93fed3c7a417\" (UID: \"2dbb714a-be88-41cd-aa13-93fed3c7a417\") " Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.004983 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dbb714a-be88-41cd-aa13-93fed3c7a417-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2dbb714a-be88-41cd-aa13-93fed3c7a417" (UID: "2dbb714a-be88-41cd-aa13-93fed3c7a417"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.006328 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dbb714a-be88-41cd-aa13-93fed3c7a417-logs\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.006373 4735 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2dbb714a-be88-41cd-aa13-93fed3c7a417-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.019915 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dbb714a-be88-41cd-aa13-93fed3c7a417-kube-api-access-kzmv2" (OuterVolumeSpecName: "kube-api-access-kzmv2") pod "2dbb714a-be88-41cd-aa13-93fed3c7a417" (UID: "2dbb714a-be88-41cd-aa13-93fed3c7a417"). InnerVolumeSpecName "kube-api-access-kzmv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.021042 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dbb714a-be88-41cd-aa13-93fed3c7a417-scripts" (OuterVolumeSpecName: "scripts") pod "2dbb714a-be88-41cd-aa13-93fed3c7a417" (UID: "2dbb714a-be88-41cd-aa13-93fed3c7a417"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.021431 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "2dbb714a-be88-41cd-aa13-93fed3c7a417" (UID: "2dbb714a-be88-41cd-aa13-93fed3c7a417"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.082733 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dbb714a-be88-41cd-aa13-93fed3c7a417-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2dbb714a-be88-41cd-aa13-93fed3c7a417" (UID: "2dbb714a-be88-41cd-aa13-93fed3c7a417"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.107321 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-ovsdbserver-nb\") pod \"2aebb4f3-5596-43e8-bbc4-dfb874bccd88\" (UID: \"2aebb4f3-5596-43e8-bbc4-dfb874bccd88\") " Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.107429 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-ovsdbserver-sb\") pod \"2aebb4f3-5596-43e8-bbc4-dfb874bccd88\" (UID: \"2aebb4f3-5596-43e8-bbc4-dfb874bccd88\") " Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.107465 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-dns-svc\") pod \"2aebb4f3-5596-43e8-bbc4-dfb874bccd88\" (UID: \"2aebb4f3-5596-43e8-bbc4-dfb874bccd88\") " Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.107550 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-dns-swift-storage-0\") pod \"2aebb4f3-5596-43e8-bbc4-dfb874bccd88\" (UID: \"2aebb4f3-5596-43e8-bbc4-dfb874bccd88\") " Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.107593 4735 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzfvg\" (UniqueName: \"kubernetes.io/projected/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-kube-api-access-hzfvg\") pod \"2aebb4f3-5596-43e8-bbc4-dfb874bccd88\" (UID: \"2aebb4f3-5596-43e8-bbc4-dfb874bccd88\") " Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.107654 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-config\") pod \"2aebb4f3-5596-43e8-bbc4-dfb874bccd88\" (UID: \"2aebb4f3-5596-43e8-bbc4-dfb874bccd88\") " Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.108065 4735 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.108083 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dbb714a-be88-41cd-aa13-93fed3c7a417-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.108093 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dbb714a-be88-41cd-aa13-93fed3c7a417-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.108104 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzmv2\" (UniqueName: \"kubernetes.io/projected/2dbb714a-be88-41cd-aa13-93fed3c7a417-kube-api-access-kzmv2\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.114571 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dbb714a-be88-41cd-aa13-93fed3c7a417-config-data" (OuterVolumeSpecName: "config-data") pod "2dbb714a-be88-41cd-aa13-93fed3c7a417" (UID: 
"2dbb714a-be88-41cd-aa13-93fed3c7a417"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.117402 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-kube-api-access-hzfvg" (OuterVolumeSpecName: "kube-api-access-hzfvg") pod "2aebb4f3-5596-43e8-bbc4-dfb874bccd88" (UID: "2aebb4f3-5596-43e8-bbc4-dfb874bccd88"). InnerVolumeSpecName "kube-api-access-hzfvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.128794 4735 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.147117 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-config" (OuterVolumeSpecName: "config") pod "2aebb4f3-5596-43e8-bbc4-dfb874bccd88" (UID: "2aebb4f3-5596-43e8-bbc4-dfb874bccd88"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.154670 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2aebb4f3-5596-43e8-bbc4-dfb874bccd88" (UID: "2aebb4f3-5596-43e8-bbc4-dfb874bccd88"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.154691 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2aebb4f3-5596-43e8-bbc4-dfb874bccd88" (UID: "2aebb4f3-5596-43e8-bbc4-dfb874bccd88"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.156376 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2aebb4f3-5596-43e8-bbc4-dfb874bccd88" (UID: "2aebb4f3-5596-43e8-bbc4-dfb874bccd88"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.171592 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2aebb4f3-5596-43e8-bbc4-dfb874bccd88" (UID: "2aebb4f3-5596-43e8-bbc4-dfb874bccd88"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.184953 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-e0db-account-create-d2r8g"] Oct 01 10:33:17 crc kubenswrapper[4735]: W1001 10:33:17.187280 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19f45b80_8249_470b_b45c_ecca38477609.slice/crio-34fbb51d8685bbd8b4eeb7048631e7eeb7cd5e7e8e766bd45bc9e6efc6bcbcde WatchSource:0}: Error finding container 34fbb51d8685bbd8b4eeb7048631e7eeb7cd5e7e8e766bd45bc9e6efc6bcbcde: Status 404 returned error can't find the container with id 34fbb51d8685bbd8b4eeb7048631e7eeb7cd5e7e8e766bd45bc9e6efc6bcbcde Oct 01 10:33:17 crc kubenswrapper[4735]: W1001 10:33:17.209115 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45ed423f_4895_4df3_9a04_2b916f38f57d.slice/crio-1722000fc5c11369090132cb41a4718d023583ec4f6678d0c5f1663aa4f1afc4 WatchSource:0}: Error finding container 1722000fc5c11369090132cb41a4718d023583ec4f6678d0c5f1663aa4f1afc4: Status 404 returned error can't find the container with id 1722000fc5c11369090132cb41a4718d023583ec4f6678d0c5f1663aa4f1afc4 Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.210020 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.210094 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dbb714a-be88-41cd-aa13-93fed3c7a417-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.210104 4735 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.210113 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.210389 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.210428 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.210593 4735 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.210777 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzfvg\" (UniqueName: \"kubernetes.io/projected/2aebb4f3-5596-43e8-bbc4-dfb874bccd88-kube-api-access-hzfvg\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.212880 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7bcff764fb-c7nmm"] Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.330090 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-4552-account-create-25f9d"] Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.339762 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7746dbdbf6-t6f7n"] Oct 01 10:33:17 crc kubenswrapper[4735]: W1001 10:33:17.340319 4735 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7353c4ca_59bc_4a50_8840_8365f90f6384.slice/crio-2a9255e6563d55c8ba24f0125ee01dc19a62a97de825637870af63a6cd70cfb4 WatchSource:0}: Error finding container 2a9255e6563d55c8ba24f0125ee01dc19a62a97de825637870af63a6cd70cfb4: Status 404 returned error can't find the container with id 2a9255e6563d55c8ba24f0125ee01dc19a62a97de825637870af63a6cd70cfb4 Oct 01 10:33:17 crc kubenswrapper[4735]: W1001 10:33:17.340590 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod314d523b_a835_4be5_b964_108fbc40db3a.slice/crio-cc5240ee1e146582cd5993c30ad4d405b83b28214d6188c7e7d02e50d6919454 WatchSource:0}: Error finding container cc5240ee1e146582cd5993c30ad4d405b83b28214d6188c7e7d02e50d6919454: Status 404 returned error can't find the container with id cc5240ee1e146582cd5993c30ad4d405b83b28214d6188c7e7d02e50d6919454 Oct 01 10:33:17 crc kubenswrapper[4735]: W1001 10:33:17.344867 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc176cb22_cf84_4a55_aac6_6fff6792ea56.slice/crio-4fc78d1807c99c49a2d5bbb7f70b026cc2c53bc6af98a30b7a4ab408202b2e2e WatchSource:0}: Error finding container 4fc78d1807c99c49a2d5bbb7f70b026cc2c53bc6af98a30b7a4ab408202b2e2e: Status 404 returned error can't find the container with id 4fc78d1807c99c49a2d5bbb7f70b026cc2c53bc6af98a30b7a4ab408202b2e2e Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.351288 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xqbbf"] Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.404520 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xqbbf" 
event={"ID":"c176cb22-cf84-4a55-aac6-6fff6792ea56","Type":"ContainerStarted","Data":"4fc78d1807c99c49a2d5bbb7f70b026cc2c53bc6af98a30b7a4ab408202b2e2e"} Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.405741 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-cd5d6d777-529hb" event={"ID":"db7fa1b4-c7fb-4b97-85be-4fec151375fa","Type":"ContainerStarted","Data":"8ee347a787b48b4ce4b7a1980acfd67a24c0b120e1bdc0c840b6dd75db444a69"} Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.408005 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-szpsz" event={"ID":"2aebb4f3-5596-43e8-bbc4-dfb874bccd88","Type":"ContainerDied","Data":"8c13d94dee78018b33cb18798ede432e41141bfa53203b9970af39ec2ae9d4ca"} Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.408051 4735 scope.go:117] "RemoveContainer" containerID="3c36bda10c0ef79e37afdadeff00655bd2951d835224a091e47cd6984653e325" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.408020 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-szpsz" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.409188 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7746dbdbf6-t6f7n" event={"ID":"7353c4ca-59bc-4a50-8840-8365f90f6384","Type":"ContainerStarted","Data":"2a9255e6563d55c8ba24f0125ee01dc19a62a97de825637870af63a6cd70cfb4"} Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.411771 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4552-account-create-25f9d" event={"ID":"314d523b-a835-4be5-b964-108fbc40db3a","Type":"ContainerStarted","Data":"cc5240ee1e146582cd5993c30ad4d405b83b28214d6188c7e7d02e50d6919454"} Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.412810 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e0db-account-create-d2r8g" event={"ID":"19f45b80-8249-470b-b45c-ecca38477609","Type":"ContainerStarted","Data":"34fbb51d8685bbd8b4eeb7048631e7eeb7cd5e7e8e766bd45bc9e6efc6bcbcde"} Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.414767 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2dbb714a-be88-41cd-aa13-93fed3c7a417","Type":"ContainerDied","Data":"f78ff279464cbb426665a438fc44e9aa1b72be36a7c693316047ba4bbfb9ddb9"} Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.414837 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.417474 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.417566 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bcff764fb-c7nmm" event={"ID":"45ed423f-4895-4df3-9a04-2b916f38f57d","Type":"ContainerStarted","Data":"1722000fc5c11369090132cb41a4718d023583ec4f6678d0c5f1663aa4f1afc4"} Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.428148 4735 scope.go:117] "RemoveContainer" containerID="76e40cc5ebbe34ad51586bfe9895327200f67365acb2545737b55f9be6a8f934" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.456705 4735 scope.go:117] "RemoveContainer" containerID="13c0d553f71e35c29e58aad707db5ba279925752f8b3f13cf13508d9f91a338d" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.476558 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.482727 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.512040 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-d6e0-account-create-2pwpg"] Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.530564 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-szpsz"] Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.537298 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-szpsz"] Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.547052 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.551807 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 10:33:17 crc kubenswrapper[4735]: E1001 10:33:17.552199 4735 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="2aebb4f3-5596-43e8-bbc4-dfb874bccd88" containerName="init" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.552211 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aebb4f3-5596-43e8-bbc4-dfb874bccd88" containerName="init" Oct 01 10:33:17 crc kubenswrapper[4735]: E1001 10:33:17.552224 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b74e1979-8744-4246-96b0-13b3fbbff698" containerName="glance-httpd" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.552229 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b74e1979-8744-4246-96b0-13b3fbbff698" containerName="glance-httpd" Oct 01 10:33:17 crc kubenswrapper[4735]: E1001 10:33:17.552239 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dbb714a-be88-41cd-aa13-93fed3c7a417" containerName="glance-log" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.552245 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dbb714a-be88-41cd-aa13-93fed3c7a417" containerName="glance-log" Oct 01 10:33:17 crc kubenswrapper[4735]: E1001 10:33:17.552257 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dbb714a-be88-41cd-aa13-93fed3c7a417" containerName="glance-httpd" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.552262 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dbb714a-be88-41cd-aa13-93fed3c7a417" containerName="glance-httpd" Oct 01 10:33:17 crc kubenswrapper[4735]: E1001 10:33:17.552273 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aebb4f3-5596-43e8-bbc4-dfb874bccd88" containerName="dnsmasq-dns" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.552280 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aebb4f3-5596-43e8-bbc4-dfb874bccd88" containerName="dnsmasq-dns" Oct 01 10:33:17 crc kubenswrapper[4735]: E1001 10:33:17.552298 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b74e1979-8744-4246-96b0-13b3fbbff698" 
containerName="glance-log" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.552304 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b74e1979-8744-4246-96b0-13b3fbbff698" containerName="glance-log" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.552469 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dbb714a-be88-41cd-aa13-93fed3c7a417" containerName="glance-httpd" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.552479 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b74e1979-8744-4246-96b0-13b3fbbff698" containerName="glance-httpd" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.552519 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dbb714a-be88-41cd-aa13-93fed3c7a417" containerName="glance-log" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.552533 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b74e1979-8744-4246-96b0-13b3fbbff698" containerName="glance-log" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.552546 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aebb4f3-5596-43e8-bbc4-dfb874bccd88" containerName="dnsmasq-dns" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.553465 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.556707 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.556931 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.557042 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-8zx2n" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.557981 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.558919 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.564646 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.573972 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.575458 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.575836 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.577942 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.581467 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.719140 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.719186 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c4185de-864c-4329-a2b2-bc7164bb33c6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4c4185de-864c-4329-a2b2-bc7164bb33c6\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.719212 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.719232 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.719368 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c4185de-864c-4329-a2b2-bc7164bb33c6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4c4185de-864c-4329-a2b2-bc7164bb33c6\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.719416 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.719556 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c4185de-864c-4329-a2b2-bc7164bb33c6-config-data\") pod \"glance-default-external-api-0\" (UID: \"4c4185de-864c-4329-a2b2-bc7164bb33c6\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.719706 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c4185de-864c-4329-a2b2-bc7164bb33c6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4c4185de-864c-4329-a2b2-bc7164bb33c6\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.719766 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c4185de-864c-4329-a2b2-bc7164bb33c6-logs\") pod \"glance-default-external-api-0\" (UID: 
\"4c4185de-864c-4329-a2b2-bc7164bb33c6\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.719806 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx2z7\" (UniqueName: \"kubernetes.io/projected/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-kube-api-access-fx2z7\") pod \"glance-default-internal-api-0\" (UID: \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.719859 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.719894 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c4185de-864c-4329-a2b2-bc7164bb33c6-scripts\") pod \"glance-default-external-api-0\" (UID: \"4c4185de-864c-4329-a2b2-bc7164bb33c6\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.719968 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"4c4185de-864c-4329-a2b2-bc7164bb33c6\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.720069 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6qpd\" (UniqueName: \"kubernetes.io/projected/4c4185de-864c-4329-a2b2-bc7164bb33c6-kube-api-access-x6qpd\") pod 
\"glance-default-external-api-0\" (UID: \"4c4185de-864c-4329-a2b2-bc7164bb33c6\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.720197 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.720224 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-logs\") pod \"glance-default-internal-api-0\" (UID: \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.821945 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"4c4185de-864c-4329-a2b2-bc7164bb33c6\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.821992 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6qpd\" (UniqueName: \"kubernetes.io/projected/4c4185de-864c-4329-a2b2-bc7164bb33c6-kube-api-access-x6qpd\") pod \"glance-default-external-api-0\" (UID: \"4c4185de-864c-4329-a2b2-bc7164bb33c6\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.822074 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.822101 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-logs\") pod \"glance-default-internal-api-0\" (UID: \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.822880 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-logs\") pod \"glance-default-internal-api-0\" (UID: \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.822919 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.822945 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c4185de-864c-4329-a2b2-bc7164bb33c6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4c4185de-864c-4329-a2b2-bc7164bb33c6\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.822969 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\") " 
pod="openstack/glance-default-internal-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.822993 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.823053 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c4185de-864c-4329-a2b2-bc7164bb33c6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4c4185de-864c-4329-a2b2-bc7164bb33c6\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.823078 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.823149 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c4185de-864c-4329-a2b2-bc7164bb33c6-config-data\") pod \"glance-default-external-api-0\" (UID: \"4c4185de-864c-4329-a2b2-bc7164bb33c6\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.823245 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c4185de-864c-4329-a2b2-bc7164bb33c6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4c4185de-864c-4329-a2b2-bc7164bb33c6\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 
10:33:17.823298 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c4185de-864c-4329-a2b2-bc7164bb33c6-logs\") pod \"glance-default-external-api-0\" (UID: \"4c4185de-864c-4329-a2b2-bc7164bb33c6\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.823333 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx2z7\" (UniqueName: \"kubernetes.io/projected/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-kube-api-access-fx2z7\") pod \"glance-default-internal-api-0\" (UID: \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.823405 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.823446 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c4185de-864c-4329-a2b2-bc7164bb33c6-scripts\") pod \"glance-default-external-api-0\" (UID: \"4c4185de-864c-4329-a2b2-bc7164bb33c6\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.823834 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"4c4185de-864c-4329-a2b2-bc7164bb33c6\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.824669 4735 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.824583 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c4185de-864c-4329-a2b2-bc7164bb33c6-logs\") pod \"glance-default-external-api-0\" (UID: \"4c4185de-864c-4329-a2b2-bc7164bb33c6\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.824746 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.824425 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c4185de-864c-4329-a2b2-bc7164bb33c6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4c4185de-864c-4329-a2b2-bc7164bb33c6\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.830185 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c4185de-864c-4329-a2b2-bc7164bb33c6-scripts\") pod \"glance-default-external-api-0\" (UID: \"4c4185de-864c-4329-a2b2-bc7164bb33c6\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.832985 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.834546 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.835335 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c4185de-864c-4329-a2b2-bc7164bb33c6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4c4185de-864c-4329-a2b2-bc7164bb33c6\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.835783 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.836727 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c4185de-864c-4329-a2b2-bc7164bb33c6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4c4185de-864c-4329-a2b2-bc7164bb33c6\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.839114 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.839490 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c4185de-864c-4329-a2b2-bc7164bb33c6-config-data\") pod \"glance-default-external-api-0\" (UID: \"4c4185de-864c-4329-a2b2-bc7164bb33c6\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.848528 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx2z7\" (UniqueName: \"kubernetes.io/projected/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-kube-api-access-fx2z7\") pod \"glance-default-internal-api-0\" (UID: \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.856731 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6qpd\" (UniqueName: \"kubernetes.io/projected/4c4185de-864c-4329-a2b2-bc7164bb33c6-kube-api-access-x6qpd\") pod \"glance-default-external-api-0\" (UID: \"4c4185de-864c-4329-a2b2-bc7164bb33c6\") " pod="openstack/glance-default-external-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.882336 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.888925 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"4c4185de-864c-4329-a2b2-bc7164bb33c6\") " 
pod="openstack/glance-default-external-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.905566 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.915782 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.917240 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2aebb4f3-5596-43e8-bbc4-dfb874bccd88" path="/var/lib/kubelet/pods/2aebb4f3-5596-43e8-bbc4-dfb874bccd88/volumes" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.918231 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dbb714a-be88-41cd-aa13-93fed3c7a417" path="/var/lib/kubelet/pods/2dbb714a-be88-41cd-aa13-93fed3c7a417/volumes" Oct 01 10:33:17 crc kubenswrapper[4735]: I1001 10:33:17.918976 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b74e1979-8744-4246-96b0-13b3fbbff698" path="/var/lib/kubelet/pods/b74e1979-8744-4246-96b0-13b3fbbff698/volumes" Oct 01 10:33:18 crc kubenswrapper[4735]: I1001 10:33:18.014830 4735 scope.go:117] "RemoveContainer" containerID="d7b58d5fe1d875294360d37e5f3cf58ff9542d4d628c5212f11d7ae6aac558d5" Oct 01 10:33:18 crc kubenswrapper[4735]: I1001 10:33:18.430276 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d6e0-account-create-2pwpg" event={"ID":"c97b49f0-ca3a-4433-b3c8-549fef88bfc1","Type":"ContainerStarted","Data":"09142c16df14ff1332156901b97cccc8a6e352ab8d15535f425d8c98231caaad"} Oct 01 10:33:18 crc kubenswrapper[4735]: I1001 10:33:18.586918 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 10:33:18 crc kubenswrapper[4735]: W1001 10:33:18.590594 4735 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde8a0fb9_bd5f_4a80_83dd_aa2eda4fc1d4.slice/crio-a2bcce01642bc2a310aabc8a548b7fabe099230a2ef08a892328d57e415f1b83 WatchSource:0}: Error finding container a2bcce01642bc2a310aabc8a548b7fabe099230a2ef08a892328d57e415f1b83: Status 404 returned error can't find the container with id a2bcce01642bc2a310aabc8a548b7fabe099230a2ef08a892328d57e415f1b83 Oct 01 10:33:18 crc kubenswrapper[4735]: I1001 10:33:18.701806 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 10:33:19 crc kubenswrapper[4735]: I1001 10:33:19.440836 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4","Type":"ContainerStarted","Data":"a2bcce01642bc2a310aabc8a548b7fabe099230a2ef08a892328d57e415f1b83"} Oct 01 10:33:19 crc kubenswrapper[4735]: I1001 10:33:19.442089 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4c4185de-864c-4329-a2b2-bc7164bb33c6","Type":"ContainerStarted","Data":"200e953a57027f5f67060a357e5c0f987f708b621186cb468ef16b72ae86efd1"} Oct 01 10:33:20 crc kubenswrapper[4735]: I1001 10:33:20.454560 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ea41140-1a15-4e64-89e5-d68b9208dff1","Type":"ContainerStarted","Data":"8e11edbb3a51634f53489e50353ab8eca5b4e68b02bec161bef6c00e771954b1"} Oct 01 10:33:20 crc kubenswrapper[4735]: I1001 10:33:20.456964 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-cd5d6d777-529hb" event={"ID":"db7fa1b4-c7fb-4b97-85be-4fec151375fa","Type":"ContainerStarted","Data":"dc1f2ab489b9c41106f18065904e23f2a61fe118fb0a8eb2a0fd916439f99627"} Oct 01 10:33:20 crc kubenswrapper[4735]: I1001 10:33:20.458377 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c88c854b9-2q2b2" 
event={"ID":"dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9","Type":"ContainerStarted","Data":"2f41577d47c185ffd16c8184feeda46c83f713c61adeb5be19109cfa62eb629a"} Oct 01 10:33:20 crc kubenswrapper[4735]: I1001 10:33:20.459561 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-687c8c8cfc-p8gqg" event={"ID":"a580b903-ba8b-44cb-bff3-8ff737dce411","Type":"ContainerStarted","Data":"3eede71fcdb5b04f262f296e20150d0aebde6252b04fd29c856e458faf3b4d1a"} Oct 01 10:33:21 crc kubenswrapper[4735]: I1001 10:33:21.469562 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e0db-account-create-d2r8g" event={"ID":"19f45b80-8249-470b-b45c-ecca38477609","Type":"ContainerStarted","Data":"f24ed144ef04ba958cbee6c962d667d0f9ea158eb7e889438e1e365107c66e21"} Oct 01 10:33:21 crc kubenswrapper[4735]: I1001 10:33:21.471642 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bcff764fb-c7nmm" event={"ID":"45ed423f-4895-4df3-9a04-2b916f38f57d","Type":"ContainerStarted","Data":"3c26fba4b03c5d2ead52a8171b3d00ed23efc77ca18ab4fe450fd043e2e25cb4"} Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 10:33:22.498674 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bcff764fb-c7nmm" event={"ID":"45ed423f-4895-4df3-9a04-2b916f38f57d","Type":"ContainerStarted","Data":"7a05d0208176e16ca827de0751205940fe05ef3bd18c2ddf1872862f95be80ef"} Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 10:33:22.503248 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-cd5d6d777-529hb" event={"ID":"db7fa1b4-c7fb-4b97-85be-4fec151375fa","Type":"ContainerStarted","Data":"4f6d808868f636be2719fc1bec51ec6244cd5da7cb855be99c24add393175bf4"} Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 10:33:22.503388 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-cd5d6d777-529hb" podUID="db7fa1b4-c7fb-4b97-85be-4fec151375fa" containerName="horizon-log" 
containerID="cri-o://dc1f2ab489b9c41106f18065904e23f2a61fe118fb0a8eb2a0fd916439f99627" gracePeriod=30 Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 10:33:22.503578 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-cd5d6d777-529hb" podUID="db7fa1b4-c7fb-4b97-85be-4fec151375fa" containerName="horizon" containerID="cri-o://4f6d808868f636be2719fc1bec51ec6244cd5da7cb855be99c24add393175bf4" gracePeriod=30 Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 10:33:22.508286 4735 generic.go:334] "Generic (PLEG): container finished" podID="314d523b-a835-4be5-b964-108fbc40db3a" containerID="55117a38017e670a1b2932fdcb18de07cf6303845e8d97559e98dcb0242b0369" exitCode=0 Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 10:33:22.508329 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4552-account-create-25f9d" event={"ID":"314d523b-a835-4be5-b964-108fbc40db3a","Type":"ContainerDied","Data":"55117a38017e670a1b2932fdcb18de07cf6303845e8d97559e98dcb0242b0369"} Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 10:33:22.514961 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4","Type":"ContainerStarted","Data":"4c27190f8b3f02d64f090187bca72a8c38246d3cfc9366619beef07e151a043b"} Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 10:33:22.514991 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4","Type":"ContainerStarted","Data":"29221342824c12498a2445acd6150e30aae7f82799dfded43e139a3d552e1f47"} Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 10:33:22.523417 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7bcff764fb-c7nmm" podStartSLOduration=10.52340041 podStartE2EDuration="10.52340041s" podCreationTimestamp="2025-10-01 10:33:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:33:22.518882678 +0000 UTC m=+961.211703940" watchObservedRunningTime="2025-10-01 10:33:22.52340041 +0000 UTC m=+961.216221672" Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 10:33:22.527531 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xqbbf" event={"ID":"c176cb22-cf84-4a55-aac6-6fff6792ea56","Type":"ContainerStarted","Data":"19eea9f23b739e7a0e61f0315d3d8dc06408a21cadf84144922df57633f716f0"} Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 10:33:22.530514 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7746dbdbf6-t6f7n" event={"ID":"7353c4ca-59bc-4a50-8840-8365f90f6384","Type":"ContainerStarted","Data":"8307f0ba3a5277e4d4d0db6c3a23176991098bc589ae9eaccb566c243c686838"} Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 10:33:22.530548 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7746dbdbf6-t6f7n" event={"ID":"7353c4ca-59bc-4a50-8840-8365f90f6384","Type":"ContainerStarted","Data":"af09e33379c2bf908b99f004524beeb0ab2579bc6f1f68610b48156dbf5fb9fa"} Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 10:33:22.533204 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c88c854b9-2q2b2" event={"ID":"dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9","Type":"ContainerStarted","Data":"f45297ae1821283c10e00cef6bdb2567eab5061fda027dbd18717bb611d28ac6"} Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 10:33:22.533380 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7c88c854b9-2q2b2" podUID="dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9" containerName="horizon-log" containerID="cri-o://2f41577d47c185ffd16c8184feeda46c83f713c61adeb5be19109cfa62eb629a" gracePeriod=30 Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 10:33:22.533728 4735 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/horizon-7c88c854b9-2q2b2" podUID="dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9" containerName="horizon" containerID="cri-o://f45297ae1821283c10e00cef6bdb2567eab5061fda027dbd18717bb611d28ac6" gracePeriod=30 Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 10:33:22.543710 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7bcff764fb-c7nmm" Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 10:33:22.543752 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7bcff764fb-c7nmm" Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 10:33:22.543994 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-687c8c8cfc-p8gqg" event={"ID":"a580b903-ba8b-44cb-bff3-8ff737dce411","Type":"ContainerStarted","Data":"28770dfd9a0f891204b7de9f740e8148c5335e16103b1203773b90f797591286"} Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 10:33:22.544140 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-687c8c8cfc-p8gqg" podUID="a580b903-ba8b-44cb-bff3-8ff737dce411" containerName="horizon-log" containerID="cri-o://3eede71fcdb5b04f262f296e20150d0aebde6252b04fd29c856e458faf3b4d1a" gracePeriod=30 Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 10:33:22.544171 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-687c8c8cfc-p8gqg" podUID="a580b903-ba8b-44cb-bff3-8ff737dce411" containerName="horizon" containerID="cri-o://28770dfd9a0f891204b7de9f740e8148c5335e16103b1203773b90f797591286" gracePeriod=30 Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 10:33:22.544619 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-cd5d6d777-529hb" podStartSLOduration=17.544609497 podStartE2EDuration="17.544609497s" podCreationTimestamp="2025-10-01 10:33:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-01 10:33:22.538696819 +0000 UTC m=+961.231518101" watchObservedRunningTime="2025-10-01 10:33:22.544609497 +0000 UTC m=+961.237430759" Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 10:33:22.547006 4735 generic.go:334] "Generic (PLEG): container finished" podID="19f45b80-8249-470b-b45c-ecca38477609" containerID="f24ed144ef04ba958cbee6c962d667d0f9ea158eb7e889438e1e365107c66e21" exitCode=0 Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 10:33:22.547056 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e0db-account-create-d2r8g" event={"ID":"19f45b80-8249-470b-b45c-ecca38477609","Type":"ContainerDied","Data":"f24ed144ef04ba958cbee6c962d667d0f9ea158eb7e889438e1e365107c66e21"} Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 10:33:22.548876 4735 generic.go:334] "Generic (PLEG): container finished" podID="c97b49f0-ca3a-4433-b3c8-549fef88bfc1" containerID="88fc78d2940c2fb975d246994510ce591ee999be68ce8220841a5eb38973e6fe" exitCode=0 Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 10:33:22.548915 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d6e0-account-create-2pwpg" event={"ID":"c97b49f0-ca3a-4433-b3c8-549fef88bfc1","Type":"ContainerDied","Data":"88fc78d2940c2fb975d246994510ce591ee999be68ce8220841a5eb38973e6fe"} Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 10:33:22.561776 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4c4185de-864c-4329-a2b2-bc7164bb33c6","Type":"ContainerStarted","Data":"0d695d3e4c0af08383ebf43a9745bc57abd3371c82ea2546282bd7cbbcce367d"} Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 10:33:22.561820 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4c4185de-864c-4329-a2b2-bc7164bb33c6","Type":"ContainerStarted","Data":"0f7a8b38e554a2525bda66d4fb06f3bedaae8ede449744a03ede59dbc2b3eabe"} Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 
10:33:22.586097 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.586078325 podStartE2EDuration="5.586078325s" podCreationTimestamp="2025-10-01 10:33:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:33:22.570483388 +0000 UTC m=+961.263304650" watchObservedRunningTime="2025-10-01 10:33:22.586078325 +0000 UTC m=+961.278899587" Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 10:33:22.601554 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-xqbbf" podStartSLOduration=10.601536738 podStartE2EDuration="10.601536738s" podCreationTimestamp="2025-10-01 10:33:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:33:22.595894327 +0000 UTC m=+961.288715589" watchObservedRunningTime="2025-10-01 10:33:22.601536738 +0000 UTC m=+961.294357990" Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 10:33:22.617431 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7746dbdbf6-t6f7n" podStartSLOduration=10.617415623 podStartE2EDuration="10.617415623s" podCreationTimestamp="2025-10-01 10:33:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:33:22.615834991 +0000 UTC m=+961.308656253" watchObservedRunningTime="2025-10-01 10:33:22.617415623 +0000 UTC m=+961.310236885" Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 10:33:22.665666 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7c88c854b9-2q2b2" podStartSLOduration=8.153944501 podStartE2EDuration="21.665651282s" podCreationTimestamp="2025-10-01 10:33:01 +0000 UTC" firstStartedPulling="2025-10-01 
10:33:02.85239248 +0000 UTC m=+941.545213742" lastFinishedPulling="2025-10-01 10:33:16.364099251 +0000 UTC m=+955.056920523" observedRunningTime="2025-10-01 10:33:22.647464856 +0000 UTC m=+961.340286118" watchObservedRunningTime="2025-10-01 10:33:22.665651282 +0000 UTC m=+961.358472544" Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 10:33:22.666676 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-687c8c8cfc-p8gqg" podStartSLOduration=7.564632857 podStartE2EDuration="21.66667181s" podCreationTimestamp="2025-10-01 10:33:01 +0000 UTC" firstStartedPulling="2025-10-01 10:33:02.586665996 +0000 UTC m=+941.279487258" lastFinishedPulling="2025-10-01 10:33:16.688704949 +0000 UTC m=+955.381526211" observedRunningTime="2025-10-01 10:33:22.664672827 +0000 UTC m=+961.357494089" watchObservedRunningTime="2025-10-01 10:33:22.66667181 +0000 UTC m=+961.359493072" Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 10:33:22.703466 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.703448453 podStartE2EDuration="5.703448453s" podCreationTimestamp="2025-10-01 10:33:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:33:22.689932692 +0000 UTC m=+961.382753954" watchObservedRunningTime="2025-10-01 10:33:22.703448453 +0000 UTC m=+961.396269715" Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 10:33:22.712121 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7746dbdbf6-t6f7n" Oct 01 10:33:22 crc kubenswrapper[4735]: I1001 10:33:22.712196 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7746dbdbf6-t6f7n" Oct 01 10:33:24 crc kubenswrapper[4735]: I1001 10:33:23.999620 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-d6e0-account-create-2pwpg" Oct 01 10:33:24 crc kubenswrapper[4735]: I1001 10:33:24.016254 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e0db-account-create-d2r8g" Oct 01 10:33:24 crc kubenswrapper[4735]: I1001 10:33:24.031327 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4552-account-create-25f9d" Oct 01 10:33:24 crc kubenswrapper[4735]: I1001 10:33:24.043324 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qss77\" (UniqueName: \"kubernetes.io/projected/19f45b80-8249-470b-b45c-ecca38477609-kube-api-access-qss77\") pod \"19f45b80-8249-470b-b45c-ecca38477609\" (UID: \"19f45b80-8249-470b-b45c-ecca38477609\") " Oct 01 10:33:24 crc kubenswrapper[4735]: I1001 10:33:24.043408 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vftj\" (UniqueName: \"kubernetes.io/projected/c97b49f0-ca3a-4433-b3c8-549fef88bfc1-kube-api-access-4vftj\") pod \"c97b49f0-ca3a-4433-b3c8-549fef88bfc1\" (UID: \"c97b49f0-ca3a-4433-b3c8-549fef88bfc1\") " Oct 01 10:33:24 crc kubenswrapper[4735]: I1001 10:33:24.051231 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c97b49f0-ca3a-4433-b3c8-549fef88bfc1-kube-api-access-4vftj" (OuterVolumeSpecName: "kube-api-access-4vftj") pod "c97b49f0-ca3a-4433-b3c8-549fef88bfc1" (UID: "c97b49f0-ca3a-4433-b3c8-549fef88bfc1"). InnerVolumeSpecName "kube-api-access-4vftj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:33:24 crc kubenswrapper[4735]: I1001 10:33:24.052388 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19f45b80-8249-470b-b45c-ecca38477609-kube-api-access-qss77" (OuterVolumeSpecName: "kube-api-access-qss77") pod "19f45b80-8249-470b-b45c-ecca38477609" (UID: "19f45b80-8249-470b-b45c-ecca38477609"). InnerVolumeSpecName "kube-api-access-qss77". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:33:24 crc kubenswrapper[4735]: I1001 10:33:24.146058 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp5wg\" (UniqueName: \"kubernetes.io/projected/314d523b-a835-4be5-b964-108fbc40db3a-kube-api-access-hp5wg\") pod \"314d523b-a835-4be5-b964-108fbc40db3a\" (UID: \"314d523b-a835-4be5-b964-108fbc40db3a\") " Oct 01 10:33:24 crc kubenswrapper[4735]: I1001 10:33:24.146573 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vftj\" (UniqueName: \"kubernetes.io/projected/c97b49f0-ca3a-4433-b3c8-549fef88bfc1-kube-api-access-4vftj\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:24 crc kubenswrapper[4735]: I1001 10:33:24.146590 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qss77\" (UniqueName: \"kubernetes.io/projected/19f45b80-8249-470b-b45c-ecca38477609-kube-api-access-qss77\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:24 crc kubenswrapper[4735]: I1001 10:33:24.149702 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/314d523b-a835-4be5-b964-108fbc40db3a-kube-api-access-hp5wg" (OuterVolumeSpecName: "kube-api-access-hp5wg") pod "314d523b-a835-4be5-b964-108fbc40db3a" (UID: "314d523b-a835-4be5-b964-108fbc40db3a"). InnerVolumeSpecName "kube-api-access-hp5wg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:33:24 crc kubenswrapper[4735]: I1001 10:33:24.248023 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hp5wg\" (UniqueName: \"kubernetes.io/projected/314d523b-a835-4be5-b964-108fbc40db3a-kube-api-access-hp5wg\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:24 crc kubenswrapper[4735]: I1001 10:33:24.586449 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e0db-account-create-d2r8g" Oct 01 10:33:24 crc kubenswrapper[4735]: I1001 10:33:24.586455 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e0db-account-create-d2r8g" event={"ID":"19f45b80-8249-470b-b45c-ecca38477609","Type":"ContainerDied","Data":"34fbb51d8685bbd8b4eeb7048631e7eeb7cd5e7e8e766bd45bc9e6efc6bcbcde"} Oct 01 10:33:24 crc kubenswrapper[4735]: I1001 10:33:24.586527 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34fbb51d8685bbd8b4eeb7048631e7eeb7cd5e7e8e766bd45bc9e6efc6bcbcde" Oct 01 10:33:24 crc kubenswrapper[4735]: I1001 10:33:24.592520 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ea41140-1a15-4e64-89e5-d68b9208dff1","Type":"ContainerStarted","Data":"4154d64030a31e12e10c808e937f2a06aa24a479c594a933c0e20d271f3ac8a8"} Oct 01 10:33:24 crc kubenswrapper[4735]: I1001 10:33:24.595187 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4552-account-create-25f9d" event={"ID":"314d523b-a835-4be5-b964-108fbc40db3a","Type":"ContainerDied","Data":"cc5240ee1e146582cd5993c30ad4d405b83b28214d6188c7e7d02e50d6919454"} Oct 01 10:33:24 crc kubenswrapper[4735]: I1001 10:33:24.595287 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc5240ee1e146582cd5993c30ad4d405b83b28214d6188c7e7d02e50d6919454" Oct 01 10:33:24 crc kubenswrapper[4735]: I1001 10:33:24.595199 4735 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4552-account-create-25f9d" Oct 01 10:33:24 crc kubenswrapper[4735]: I1001 10:33:24.600790 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d6e0-account-create-2pwpg" event={"ID":"c97b49f0-ca3a-4433-b3c8-549fef88bfc1","Type":"ContainerDied","Data":"09142c16df14ff1332156901b97cccc8a6e352ab8d15535f425d8c98231caaad"} Oct 01 10:33:24 crc kubenswrapper[4735]: I1001 10:33:24.600847 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09142c16df14ff1332156901b97cccc8a6e352ab8d15535f425d8c98231caaad" Oct 01 10:33:24 crc kubenswrapper[4735]: I1001 10:33:24.600983 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-d6e0-account-create-2pwpg" Oct 01 10:33:25 crc kubenswrapper[4735]: I1001 10:33:25.615812 4735 generic.go:334] "Generic (PLEG): container finished" podID="c176cb22-cf84-4a55-aac6-6fff6792ea56" containerID="19eea9f23b739e7a0e61f0315d3d8dc06408a21cadf84144922df57633f716f0" exitCode=0 Oct 01 10:33:25 crc kubenswrapper[4735]: I1001 10:33:25.615869 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xqbbf" event={"ID":"c176cb22-cf84-4a55-aac6-6fff6792ea56","Type":"ContainerDied","Data":"19eea9f23b739e7a0e61f0315d3d8dc06408a21cadf84144922df57633f716f0"} Oct 01 10:33:25 crc kubenswrapper[4735]: I1001 10:33:25.789187 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-cd5d6d777-529hb" Oct 01 10:33:26 crc kubenswrapper[4735]: I1001 10:33:26.981728 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xqbbf" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.020891 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c176cb22-cf84-4a55-aac6-6fff6792ea56-combined-ca-bundle\") pod \"c176cb22-cf84-4a55-aac6-6fff6792ea56\" (UID: \"c176cb22-cf84-4a55-aac6-6fff6792ea56\") " Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.020976 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c176cb22-cf84-4a55-aac6-6fff6792ea56-fernet-keys\") pod \"c176cb22-cf84-4a55-aac6-6fff6792ea56\" (UID: \"c176cb22-cf84-4a55-aac6-6fff6792ea56\") " Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.021023 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6x8t\" (UniqueName: \"kubernetes.io/projected/c176cb22-cf84-4a55-aac6-6fff6792ea56-kube-api-access-h6x8t\") pod \"c176cb22-cf84-4a55-aac6-6fff6792ea56\" (UID: \"c176cb22-cf84-4a55-aac6-6fff6792ea56\") " Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.021069 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c176cb22-cf84-4a55-aac6-6fff6792ea56-credential-keys\") pod \"c176cb22-cf84-4a55-aac6-6fff6792ea56\" (UID: \"c176cb22-cf84-4a55-aac6-6fff6792ea56\") " Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.021113 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c176cb22-cf84-4a55-aac6-6fff6792ea56-config-data\") pod \"c176cb22-cf84-4a55-aac6-6fff6792ea56\" (UID: \"c176cb22-cf84-4a55-aac6-6fff6792ea56\") " Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.021194 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/c176cb22-cf84-4a55-aac6-6fff6792ea56-scripts\") pod \"c176cb22-cf84-4a55-aac6-6fff6792ea56\" (UID: \"c176cb22-cf84-4a55-aac6-6fff6792ea56\") " Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.027597 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c176cb22-cf84-4a55-aac6-6fff6792ea56-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c176cb22-cf84-4a55-aac6-6fff6792ea56" (UID: "c176cb22-cf84-4a55-aac6-6fff6792ea56"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.035628 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c176cb22-cf84-4a55-aac6-6fff6792ea56-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "c176cb22-cf84-4a55-aac6-6fff6792ea56" (UID: "c176cb22-cf84-4a55-aac6-6fff6792ea56"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.047374 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c176cb22-cf84-4a55-aac6-6fff6792ea56-kube-api-access-h6x8t" (OuterVolumeSpecName: "kube-api-access-h6x8t") pod "c176cb22-cf84-4a55-aac6-6fff6792ea56" (UID: "c176cb22-cf84-4a55-aac6-6fff6792ea56"). InnerVolumeSpecName "kube-api-access-h6x8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.047481 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c176cb22-cf84-4a55-aac6-6fff6792ea56-scripts" (OuterVolumeSpecName: "scripts") pod "c176cb22-cf84-4a55-aac6-6fff6792ea56" (UID: "c176cb22-cf84-4a55-aac6-6fff6792ea56"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.048325 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c176cb22-cf84-4a55-aac6-6fff6792ea56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c176cb22-cf84-4a55-aac6-6fff6792ea56" (UID: "c176cb22-cf84-4a55-aac6-6fff6792ea56"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.062651 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c176cb22-cf84-4a55-aac6-6fff6792ea56-config-data" (OuterVolumeSpecName: "config-data") pod "c176cb22-cf84-4a55-aac6-6fff6792ea56" (UID: "c176cb22-cf84-4a55-aac6-6fff6792ea56"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.122711 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c176cb22-cf84-4a55-aac6-6fff6792ea56-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.122743 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c176cb22-cf84-4a55-aac6-6fff6792ea56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.122753 4735 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c176cb22-cf84-4a55-aac6-6fff6792ea56-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.122762 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6x8t\" (UniqueName: \"kubernetes.io/projected/c176cb22-cf84-4a55-aac6-6fff6792ea56-kube-api-access-h6x8t\") on node \"crc\" DevicePath \"\"" Oct 01 
10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.122771 4735 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c176cb22-cf84-4a55-aac6-6fff6792ea56-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.122781 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c176cb22-cf84-4a55-aac6-6fff6792ea56-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.126056 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-8cvdn"] Oct 01 10:33:27 crc kubenswrapper[4735]: E1001 10:33:27.126395 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19f45b80-8249-470b-b45c-ecca38477609" containerName="mariadb-account-create" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.126410 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="19f45b80-8249-470b-b45c-ecca38477609" containerName="mariadb-account-create" Oct 01 10:33:27 crc kubenswrapper[4735]: E1001 10:33:27.126420 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c97b49f0-ca3a-4433-b3c8-549fef88bfc1" containerName="mariadb-account-create" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.126426 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c97b49f0-ca3a-4433-b3c8-549fef88bfc1" containerName="mariadb-account-create" Oct 01 10:33:27 crc kubenswrapper[4735]: E1001 10:33:27.126447 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="314d523b-a835-4be5-b964-108fbc40db3a" containerName="mariadb-account-create" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.126454 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="314d523b-a835-4be5-b964-108fbc40db3a" containerName="mariadb-account-create" Oct 01 10:33:27 crc kubenswrapper[4735]: E1001 10:33:27.126466 4735 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c176cb22-cf84-4a55-aac6-6fff6792ea56" containerName="keystone-bootstrap" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.126471 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c176cb22-cf84-4a55-aac6-6fff6792ea56" containerName="keystone-bootstrap" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.126641 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c97b49f0-ca3a-4433-b3c8-549fef88bfc1" containerName="mariadb-account-create" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.126658 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c176cb22-cf84-4a55-aac6-6fff6792ea56" containerName="keystone-bootstrap" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.126675 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="19f45b80-8249-470b-b45c-ecca38477609" containerName="mariadb-account-create" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.126684 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="314d523b-a835-4be5-b964-108fbc40db3a" containerName="mariadb-account-create" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.130750 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-8cvdn" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.133651 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-dcf4d" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.133840 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.135281 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-8cvdn"] Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.224359 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541784dc-4146-459d-bee0-2f97d22a7977-combined-ca-bundle\") pod \"barbican-db-sync-8cvdn\" (UID: \"541784dc-4146-459d-bee0-2f97d22a7977\") " pod="openstack/barbican-db-sync-8cvdn" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.224421 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg2vn\" (UniqueName: \"kubernetes.io/projected/541784dc-4146-459d-bee0-2f97d22a7977-kube-api-access-rg2vn\") pod \"barbican-db-sync-8cvdn\" (UID: \"541784dc-4146-459d-bee0-2f97d22a7977\") " pod="openstack/barbican-db-sync-8cvdn" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.224520 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/541784dc-4146-459d-bee0-2f97d22a7977-db-sync-config-data\") pod \"barbican-db-sync-8cvdn\" (UID: \"541784dc-4146-459d-bee0-2f97d22a7977\") " pod="openstack/barbican-db-sync-8cvdn" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.237043 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-84vvz"] Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.238020 4735 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-84vvz" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.244723 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.246003 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xbgnd" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.251069 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.256698 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-84vvz"] Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.326576 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/541784dc-4146-459d-bee0-2f97d22a7977-db-sync-config-data\") pod \"barbican-db-sync-8cvdn\" (UID: \"541784dc-4146-459d-bee0-2f97d22a7977\") " pod="openstack/barbican-db-sync-8cvdn" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.326716 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-scripts\") pod \"cinder-db-sync-84vvz\" (UID: \"61dd2f37-7f60-42f5-a3d0-3b693d1e64be\") " pod="openstack/cinder-db-sync-84vvz" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.326746 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-etc-machine-id\") pod \"cinder-db-sync-84vvz\" (UID: \"61dd2f37-7f60-42f5-a3d0-3b693d1e64be\") " pod="openstack/cinder-db-sync-84vvz" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.326782 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541784dc-4146-459d-bee0-2f97d22a7977-combined-ca-bundle\") pod \"barbican-db-sync-8cvdn\" (UID: \"541784dc-4146-459d-bee0-2f97d22a7977\") " pod="openstack/barbican-db-sync-8cvdn" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.326898 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg2vn\" (UniqueName: \"kubernetes.io/projected/541784dc-4146-459d-bee0-2f97d22a7977-kube-api-access-rg2vn\") pod \"barbican-db-sync-8cvdn\" (UID: \"541784dc-4146-459d-bee0-2f97d22a7977\") " pod="openstack/barbican-db-sync-8cvdn" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.326953 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbrm9\" (UniqueName: \"kubernetes.io/projected/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-kube-api-access-mbrm9\") pod \"cinder-db-sync-84vvz\" (UID: \"61dd2f37-7f60-42f5-a3d0-3b693d1e64be\") " pod="openstack/cinder-db-sync-84vvz" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.327016 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-db-sync-config-data\") pod \"cinder-db-sync-84vvz\" (UID: \"61dd2f37-7f60-42f5-a3d0-3b693d1e64be\") " pod="openstack/cinder-db-sync-84vvz" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.327043 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-config-data\") pod \"cinder-db-sync-84vvz\" (UID: \"61dd2f37-7f60-42f5-a3d0-3b693d1e64be\") " pod="openstack/cinder-db-sync-84vvz" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.327145 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-combined-ca-bundle\") pod \"cinder-db-sync-84vvz\" (UID: \"61dd2f37-7f60-42f5-a3d0-3b693d1e64be\") " pod="openstack/cinder-db-sync-84vvz" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.332121 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/541784dc-4146-459d-bee0-2f97d22a7977-db-sync-config-data\") pod \"barbican-db-sync-8cvdn\" (UID: \"541784dc-4146-459d-bee0-2f97d22a7977\") " pod="openstack/barbican-db-sync-8cvdn" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.332285 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541784dc-4146-459d-bee0-2f97d22a7977-combined-ca-bundle\") pod \"barbican-db-sync-8cvdn\" (UID: \"541784dc-4146-459d-bee0-2f97d22a7977\") " pod="openstack/barbican-db-sync-8cvdn" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.342235 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg2vn\" (UniqueName: \"kubernetes.io/projected/541784dc-4146-459d-bee0-2f97d22a7977-kube-api-access-rg2vn\") pod \"barbican-db-sync-8cvdn\" (UID: \"541784dc-4146-459d-bee0-2f97d22a7977\") " pod="openstack/barbican-db-sync-8cvdn" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.428747 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-db-sync-config-data\") pod \"cinder-db-sync-84vvz\" (UID: \"61dd2f37-7f60-42f5-a3d0-3b693d1e64be\") " pod="openstack/cinder-db-sync-84vvz" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.428826 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-config-data\") pod \"cinder-db-sync-84vvz\" (UID: \"61dd2f37-7f60-42f5-a3d0-3b693d1e64be\") " pod="openstack/cinder-db-sync-84vvz" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.428887 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-combined-ca-bundle\") pod \"cinder-db-sync-84vvz\" (UID: \"61dd2f37-7f60-42f5-a3d0-3b693d1e64be\") " pod="openstack/cinder-db-sync-84vvz" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.429094 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-scripts\") pod \"cinder-db-sync-84vvz\" (UID: \"61dd2f37-7f60-42f5-a3d0-3b693d1e64be\") " pod="openstack/cinder-db-sync-84vvz" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.429168 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-etc-machine-id\") pod \"cinder-db-sync-84vvz\" (UID: \"61dd2f37-7f60-42f5-a3d0-3b693d1e64be\") " pod="openstack/cinder-db-sync-84vvz" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.429215 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-etc-machine-id\") pod \"cinder-db-sync-84vvz\" (UID: \"61dd2f37-7f60-42f5-a3d0-3b693d1e64be\") " pod="openstack/cinder-db-sync-84vvz" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.429304 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbrm9\" (UniqueName: \"kubernetes.io/projected/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-kube-api-access-mbrm9\") pod \"cinder-db-sync-84vvz\" (UID: 
\"61dd2f37-7f60-42f5-a3d0-3b693d1e64be\") " pod="openstack/cinder-db-sync-84vvz" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.432613 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-scripts\") pod \"cinder-db-sync-84vvz\" (UID: \"61dd2f37-7f60-42f5-a3d0-3b693d1e64be\") " pod="openstack/cinder-db-sync-84vvz" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.432617 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-combined-ca-bundle\") pod \"cinder-db-sync-84vvz\" (UID: \"61dd2f37-7f60-42f5-a3d0-3b693d1e64be\") " pod="openstack/cinder-db-sync-84vvz" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.432918 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-db-sync-config-data\") pod \"cinder-db-sync-84vvz\" (UID: \"61dd2f37-7f60-42f5-a3d0-3b693d1e64be\") " pod="openstack/cinder-db-sync-84vvz" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.433384 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-config-data\") pod \"cinder-db-sync-84vvz\" (UID: \"61dd2f37-7f60-42f5-a3d0-3b693d1e64be\") " pod="openstack/cinder-db-sync-84vvz" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.444807 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbrm9\" (UniqueName: \"kubernetes.io/projected/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-kube-api-access-mbrm9\") pod \"cinder-db-sync-84vvz\" (UID: \"61dd2f37-7f60-42f5-a3d0-3b693d1e64be\") " pod="openstack/cinder-db-sync-84vvz" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.449422 4735 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-8cvdn" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.555912 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-84vvz" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.652563 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-grznc"] Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.653821 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-grznc" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.656091 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.656751 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-mp5h7" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.656787 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.662040 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-grznc"] Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.672364 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xqbbf" event={"ID":"c176cb22-cf84-4a55-aac6-6fff6792ea56","Type":"ContainerDied","Data":"4fc78d1807c99c49a2d5bbb7f70b026cc2c53bc6af98a30b7a4ab408202b2e2e"} Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.672559 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fc78d1807c99c49a2d5bbb7f70b026cc2c53bc6af98a30b7a4ab408202b2e2e" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.672654 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xqbbf" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.745101 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6-config\") pod \"neutron-db-sync-grznc\" (UID: \"e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6\") " pod="openstack/neutron-db-sync-grznc" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.752664 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t45v\" (UniqueName: \"kubernetes.io/projected/e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6-kube-api-access-6t45v\") pod \"neutron-db-sync-grznc\" (UID: \"e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6\") " pod="openstack/neutron-db-sync-grznc" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.753053 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6-combined-ca-bundle\") pod \"neutron-db-sync-grznc\" (UID: \"e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6\") " pod="openstack/neutron-db-sync-grznc" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.797053 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7888d7549b-dbpkr"] Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.798546 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7888d7549b-dbpkr" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.803862 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.804124 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.804332 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.805913 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pn2sw" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.806132 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.808588 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.820460 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7888d7549b-dbpkr"] Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.854222 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6-combined-ca-bundle\") pod \"neutron-db-sync-grznc\" (UID: \"e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6\") " pod="openstack/neutron-db-sync-grznc" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.854549 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/98103d81-4a3c-4c99-9d51-f73f8e5fd295-fernet-keys\") pod \"keystone-7888d7549b-dbpkr\" (UID: \"98103d81-4a3c-4c99-9d51-f73f8e5fd295\") " pod="openstack/keystone-7888d7549b-dbpkr" 
Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.854572 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98103d81-4a3c-4c99-9d51-f73f8e5fd295-config-data\") pod \"keystone-7888d7549b-dbpkr\" (UID: \"98103d81-4a3c-4c99-9d51-f73f8e5fd295\") " pod="openstack/keystone-7888d7549b-dbpkr" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.857000 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6-config\") pod \"neutron-db-sync-grznc\" (UID: \"e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6\") " pod="openstack/neutron-db-sync-grznc" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.857054 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/98103d81-4a3c-4c99-9d51-f73f8e5fd295-credential-keys\") pod \"keystone-7888d7549b-dbpkr\" (UID: \"98103d81-4a3c-4c99-9d51-f73f8e5fd295\") " pod="openstack/keystone-7888d7549b-dbpkr" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.857178 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98103d81-4a3c-4c99-9d51-f73f8e5fd295-scripts\") pod \"keystone-7888d7549b-dbpkr\" (UID: \"98103d81-4a3c-4c99-9d51-f73f8e5fd295\") " pod="openstack/keystone-7888d7549b-dbpkr" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.857227 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d29z\" (UniqueName: \"kubernetes.io/projected/98103d81-4a3c-4c99-9d51-f73f8e5fd295-kube-api-access-2d29z\") pod \"keystone-7888d7549b-dbpkr\" (UID: \"98103d81-4a3c-4c99-9d51-f73f8e5fd295\") " pod="openstack/keystone-7888d7549b-dbpkr" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 
10:33:27.857329 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98103d81-4a3c-4c99-9d51-f73f8e5fd295-internal-tls-certs\") pod \"keystone-7888d7549b-dbpkr\" (UID: \"98103d81-4a3c-4c99-9d51-f73f8e5fd295\") " pod="openstack/keystone-7888d7549b-dbpkr" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.857380 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98103d81-4a3c-4c99-9d51-f73f8e5fd295-public-tls-certs\") pod \"keystone-7888d7549b-dbpkr\" (UID: \"98103d81-4a3c-4c99-9d51-f73f8e5fd295\") " pod="openstack/keystone-7888d7549b-dbpkr" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.857399 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98103d81-4a3c-4c99-9d51-f73f8e5fd295-combined-ca-bundle\") pod \"keystone-7888d7549b-dbpkr\" (UID: \"98103d81-4a3c-4c99-9d51-f73f8e5fd295\") " pod="openstack/keystone-7888d7549b-dbpkr" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.857457 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t45v\" (UniqueName: \"kubernetes.io/projected/e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6-kube-api-access-6t45v\") pod \"neutron-db-sync-grznc\" (UID: \"e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6\") " pod="openstack/neutron-db-sync-grznc" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.864986 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6-combined-ca-bundle\") pod \"neutron-db-sync-grznc\" (UID: \"e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6\") " pod="openstack/neutron-db-sync-grznc" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.869087 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6-config\") pod \"neutron-db-sync-grznc\" (UID: \"e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6\") " pod="openstack/neutron-db-sync-grznc" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.885905 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t45v\" (UniqueName: \"kubernetes.io/projected/e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6-kube-api-access-6t45v\") pod \"neutron-db-sync-grznc\" (UID: \"e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6\") " pod="openstack/neutron-db-sync-grznc" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.936071 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-8cvdn"] Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.936108 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.936119 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.936128 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.936138 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.958763 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/98103d81-4a3c-4c99-9d51-f73f8e5fd295-fernet-keys\") pod \"keystone-7888d7549b-dbpkr\" (UID: \"98103d81-4a3c-4c99-9d51-f73f8e5fd295\") " pod="openstack/keystone-7888d7549b-dbpkr" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.958811 
4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98103d81-4a3c-4c99-9d51-f73f8e5fd295-config-data\") pod \"keystone-7888d7549b-dbpkr\" (UID: \"98103d81-4a3c-4c99-9d51-f73f8e5fd295\") " pod="openstack/keystone-7888d7549b-dbpkr" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.958860 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/98103d81-4a3c-4c99-9d51-f73f8e5fd295-credential-keys\") pod \"keystone-7888d7549b-dbpkr\" (UID: \"98103d81-4a3c-4c99-9d51-f73f8e5fd295\") " pod="openstack/keystone-7888d7549b-dbpkr" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.958904 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98103d81-4a3c-4c99-9d51-f73f8e5fd295-scripts\") pod \"keystone-7888d7549b-dbpkr\" (UID: \"98103d81-4a3c-4c99-9d51-f73f8e5fd295\") " pod="openstack/keystone-7888d7549b-dbpkr" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.958932 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d29z\" (UniqueName: \"kubernetes.io/projected/98103d81-4a3c-4c99-9d51-f73f8e5fd295-kube-api-access-2d29z\") pod \"keystone-7888d7549b-dbpkr\" (UID: \"98103d81-4a3c-4c99-9d51-f73f8e5fd295\") " pod="openstack/keystone-7888d7549b-dbpkr" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.958985 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98103d81-4a3c-4c99-9d51-f73f8e5fd295-internal-tls-certs\") pod \"keystone-7888d7549b-dbpkr\" (UID: \"98103d81-4a3c-4c99-9d51-f73f8e5fd295\") " pod="openstack/keystone-7888d7549b-dbpkr" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.959021 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98103d81-4a3c-4c99-9d51-f73f8e5fd295-public-tls-certs\") pod \"keystone-7888d7549b-dbpkr\" (UID: \"98103d81-4a3c-4c99-9d51-f73f8e5fd295\") " pod="openstack/keystone-7888d7549b-dbpkr" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.959057 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98103d81-4a3c-4c99-9d51-f73f8e5fd295-combined-ca-bundle\") pod \"keystone-7888d7549b-dbpkr\" (UID: \"98103d81-4a3c-4c99-9d51-f73f8e5fd295\") " pod="openstack/keystone-7888d7549b-dbpkr" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.969605 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98103d81-4a3c-4c99-9d51-f73f8e5fd295-public-tls-certs\") pod \"keystone-7888d7549b-dbpkr\" (UID: \"98103d81-4a3c-4c99-9d51-f73f8e5fd295\") " pod="openstack/keystone-7888d7549b-dbpkr" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.973287 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98103d81-4a3c-4c99-9d51-f73f8e5fd295-config-data\") pod \"keystone-7888d7549b-dbpkr\" (UID: \"98103d81-4a3c-4c99-9d51-f73f8e5fd295\") " pod="openstack/keystone-7888d7549b-dbpkr" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.974144 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.980672 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98103d81-4a3c-4c99-9d51-f73f8e5fd295-internal-tls-certs\") pod \"keystone-7888d7549b-dbpkr\" (UID: \"98103d81-4a3c-4c99-9d51-f73f8e5fd295\") " pod="openstack/keystone-7888d7549b-dbpkr" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.981276 
4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98103d81-4a3c-4c99-9d51-f73f8e5fd295-scripts\") pod \"keystone-7888d7549b-dbpkr\" (UID: \"98103d81-4a3c-4c99-9d51-f73f8e5fd295\") " pod="openstack/keystone-7888d7549b-dbpkr" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.981432 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/98103d81-4a3c-4c99-9d51-f73f8e5fd295-fernet-keys\") pod \"keystone-7888d7549b-dbpkr\" (UID: \"98103d81-4a3c-4c99-9d51-f73f8e5fd295\") " pod="openstack/keystone-7888d7549b-dbpkr" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.984663 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d29z\" (UniqueName: \"kubernetes.io/projected/98103d81-4a3c-4c99-9d51-f73f8e5fd295-kube-api-access-2d29z\") pod \"keystone-7888d7549b-dbpkr\" (UID: \"98103d81-4a3c-4c99-9d51-f73f8e5fd295\") " pod="openstack/keystone-7888d7549b-dbpkr" Oct 01 10:33:27 crc kubenswrapper[4735]: I1001 10:33:27.987332 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98103d81-4a3c-4c99-9d51-f73f8e5fd295-combined-ca-bundle\") pod \"keystone-7888d7549b-dbpkr\" (UID: \"98103d81-4a3c-4c99-9d51-f73f8e5fd295\") " pod="openstack/keystone-7888d7549b-dbpkr" Oct 01 10:33:28 crc kubenswrapper[4735]: I1001 10:33:27.992745 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 01 10:33:28 crc kubenswrapper[4735]: I1001 10:33:27.993205 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/98103d81-4a3c-4c99-9d51-f73f8e5fd295-credential-keys\") pod \"keystone-7888d7549b-dbpkr\" (UID: \"98103d81-4a3c-4c99-9d51-f73f8e5fd295\") " pod="openstack/keystone-7888d7549b-dbpkr" Oct 01 
10:33:28 crc kubenswrapper[4735]: I1001 10:33:28.012289 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-grznc" Oct 01 10:33:28 crc kubenswrapper[4735]: I1001 10:33:28.023367 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 01 10:33:28 crc kubenswrapper[4735]: I1001 10:33:28.023581 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 01 10:33:28 crc kubenswrapper[4735]: I1001 10:33:28.134034 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7888d7549b-dbpkr" Oct 01 10:33:28 crc kubenswrapper[4735]: I1001 10:33:28.237547 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-84vvz"] Oct 01 10:33:28 crc kubenswrapper[4735]: W1001 10:33:28.291439 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61dd2f37_7f60_42f5_a3d0_3b693d1e64be.slice/crio-79bb3c3ebe3f4ba53348cf93b6beeaec50fe3ff1506f9799886e91f5ccab8fe7 WatchSource:0}: Error finding container 79bb3c3ebe3f4ba53348cf93b6beeaec50fe3ff1506f9799886e91f5ccab8fe7: Status 404 returned error can't find the container with id 79bb3c3ebe3f4ba53348cf93b6beeaec50fe3ff1506f9799886e91f5ccab8fe7 Oct 01 10:33:28 crc kubenswrapper[4735]: I1001 10:33:28.511986 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7888d7549b-dbpkr"] Oct 01 10:33:28 crc kubenswrapper[4735]: I1001 10:33:28.533865 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-grznc"] Oct 01 10:33:28 crc kubenswrapper[4735]: I1001 10:33:28.685059 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7888d7549b-dbpkr" 
event={"ID":"98103d81-4a3c-4c99-9d51-f73f8e5fd295","Type":"ContainerStarted","Data":"5c100b52f4b7755f1e3dd2fe34e99940780648c93c9fe32a3158e000d061fac0"} Oct 01 10:33:28 crc kubenswrapper[4735]: I1001 10:33:28.687232 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-grznc" event={"ID":"e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6","Type":"ContainerStarted","Data":"7e8cdb021bc7826e5b9706cf1bdc804b4520ce1564744aa3799b7f88c4e78f19"} Oct 01 10:33:28 crc kubenswrapper[4735]: I1001 10:33:28.689043 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-84vvz" event={"ID":"61dd2f37-7f60-42f5-a3d0-3b693d1e64be","Type":"ContainerStarted","Data":"79bb3c3ebe3f4ba53348cf93b6beeaec50fe3ff1506f9799886e91f5ccab8fe7"} Oct 01 10:33:28 crc kubenswrapper[4735]: I1001 10:33:28.690046 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8cvdn" event={"ID":"541784dc-4146-459d-bee0-2f97d22a7977","Type":"ContainerStarted","Data":"8106cc853f9d1653f43b0d0e356c1e0c215d1019ae59972d6aba6a1c686af251"} Oct 01 10:33:28 crc kubenswrapper[4735]: I1001 10:33:28.690975 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 01 10:33:28 crc kubenswrapper[4735]: I1001 10:33:28.690997 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 01 10:33:28 crc kubenswrapper[4735]: I1001 10:33:28.691052 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 01 10:33:28 crc kubenswrapper[4735]: I1001 10:33:28.691061 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 01 10:33:29 crc kubenswrapper[4735]: I1001 10:33:29.697881 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7888d7549b-dbpkr" 
event={"ID":"98103d81-4a3c-4c99-9d51-f73f8e5fd295","Type":"ContainerStarted","Data":"285cb846743c7a24c9ed2bdc17d36d5bf2a31d9903ee190b67faa1ed0cb9e94d"} Oct 01 10:33:29 crc kubenswrapper[4735]: I1001 10:33:29.698200 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7888d7549b-dbpkr" Oct 01 10:33:29 crc kubenswrapper[4735]: I1001 10:33:29.699845 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-t8d7j" event={"ID":"c7cb175b-c967-47c2-96b0-da043b6d3506","Type":"ContainerStarted","Data":"871ba6435ca82ef3378113758f71d32f1bf5a06ff46ed614a610e83dca5a4e19"} Oct 01 10:33:29 crc kubenswrapper[4735]: I1001 10:33:29.701466 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-grznc" event={"ID":"e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6","Type":"ContainerStarted","Data":"9e576938ab80f23960b56223b285b3cd0eb4e29059a1804a453a3467282b53cf"} Oct 01 10:33:29 crc kubenswrapper[4735]: I1001 10:33:29.716991 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7888d7549b-dbpkr" podStartSLOduration=2.716975682 podStartE2EDuration="2.716975682s" podCreationTimestamp="2025-10-01 10:33:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:33:29.715278477 +0000 UTC m=+968.408099739" watchObservedRunningTime="2025-10-01 10:33:29.716975682 +0000 UTC m=+968.409796944" Oct 01 10:33:29 crc kubenswrapper[4735]: I1001 10:33:29.733364 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-t8d7j" podStartSLOduration=2.998038544 podStartE2EDuration="28.73334818s" podCreationTimestamp="2025-10-01 10:33:01 +0000 UTC" firstStartedPulling="2025-10-01 10:33:02.900550239 +0000 UTC m=+941.593371501" lastFinishedPulling="2025-10-01 10:33:28.635859875 +0000 UTC m=+967.328681137" observedRunningTime="2025-10-01 
10:33:29.728299615 +0000 UTC m=+968.421120877" watchObservedRunningTime="2025-10-01 10:33:29.73334818 +0000 UTC m=+968.426169442" Oct 01 10:33:29 crc kubenswrapper[4735]: I1001 10:33:29.742143 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-grznc" podStartSLOduration=2.7421300349999997 podStartE2EDuration="2.742130035s" podCreationTimestamp="2025-10-01 10:33:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:33:29.738889279 +0000 UTC m=+968.431710541" watchObservedRunningTime="2025-10-01 10:33:29.742130035 +0000 UTC m=+968.434951297" Oct 01 10:33:30 crc kubenswrapper[4735]: I1001 10:33:30.712043 4735 generic.go:334] "Generic (PLEG): container finished" podID="c7cb175b-c967-47c2-96b0-da043b6d3506" containerID="871ba6435ca82ef3378113758f71d32f1bf5a06ff46ed614a610e83dca5a4e19" exitCode=0 Oct 01 10:33:30 crc kubenswrapper[4735]: I1001 10:33:30.712302 4735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 10:33:30 crc kubenswrapper[4735]: I1001 10:33:30.712333 4735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 10:33:30 crc kubenswrapper[4735]: I1001 10:33:30.712785 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-t8d7j" event={"ID":"c7cb175b-c967-47c2-96b0-da043b6d3506","Type":"ContainerDied","Data":"871ba6435ca82ef3378113758f71d32f1bf5a06ff46ed614a610e83dca5a4e19"} Oct 01 10:33:30 crc kubenswrapper[4735]: I1001 10:33:30.742253 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 01 10:33:30 crc kubenswrapper[4735]: I1001 10:33:30.742323 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 01 10:33:30 crc kubenswrapper[4735]: I1001 10:33:30.780354 4735 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 01 10:33:31 crc kubenswrapper[4735]: I1001 10:33:31.162149 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 01 10:33:31 crc kubenswrapper[4735]: I1001 10:33:31.884482 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-687c8c8cfc-p8gqg" Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.085713 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-t8d7j" Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.141375 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7cb175b-c967-47c2-96b0-da043b6d3506-config-data\") pod \"c7cb175b-c967-47c2-96b0-da043b6d3506\" (UID: \"c7cb175b-c967-47c2-96b0-da043b6d3506\") " Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.141697 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7cb175b-c967-47c2-96b0-da043b6d3506-scripts\") pod \"c7cb175b-c967-47c2-96b0-da043b6d3506\" (UID: \"c7cb175b-c967-47c2-96b0-da043b6d3506\") " Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.141836 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7cb175b-c967-47c2-96b0-da043b6d3506-logs\") pod \"c7cb175b-c967-47c2-96b0-da043b6d3506\" (UID: \"c7cb175b-c967-47c2-96b0-da043b6d3506\") " Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.141858 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fxds\" (UniqueName: \"kubernetes.io/projected/c7cb175b-c967-47c2-96b0-da043b6d3506-kube-api-access-4fxds\") pod \"c7cb175b-c967-47c2-96b0-da043b6d3506\" (UID: 
\"c7cb175b-c967-47c2-96b0-da043b6d3506\") " Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.141892 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7cb175b-c967-47c2-96b0-da043b6d3506-combined-ca-bundle\") pod \"c7cb175b-c967-47c2-96b0-da043b6d3506\" (UID: \"c7cb175b-c967-47c2-96b0-da043b6d3506\") " Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.143360 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7cb175b-c967-47c2-96b0-da043b6d3506-logs" (OuterVolumeSpecName: "logs") pod "c7cb175b-c967-47c2-96b0-da043b6d3506" (UID: "c7cb175b-c967-47c2-96b0-da043b6d3506"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.148643 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7cb175b-c967-47c2-96b0-da043b6d3506-scripts" (OuterVolumeSpecName: "scripts") pod "c7cb175b-c967-47c2-96b0-da043b6d3506" (UID: "c7cb175b-c967-47c2-96b0-da043b6d3506"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.153732 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7cb175b-c967-47c2-96b0-da043b6d3506-kube-api-access-4fxds" (OuterVolumeSpecName: "kube-api-access-4fxds") pod "c7cb175b-c967-47c2-96b0-da043b6d3506" (UID: "c7cb175b-c967-47c2-96b0-da043b6d3506"). InnerVolumeSpecName "kube-api-access-4fxds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.169563 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7cb175b-c967-47c2-96b0-da043b6d3506-config-data" (OuterVolumeSpecName: "config-data") pod "c7cb175b-c967-47c2-96b0-da043b6d3506" (UID: "c7cb175b-c967-47c2-96b0-da043b6d3506"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.172008 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7cb175b-c967-47c2-96b0-da043b6d3506-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7cb175b-c967-47c2-96b0-da043b6d3506" (UID: "c7cb175b-c967-47c2-96b0-da043b6d3506"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.198558 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7c88c854b9-2q2b2" Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.243654 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7cb175b-c967-47c2-96b0-da043b6d3506-logs\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.243685 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fxds\" (UniqueName: \"kubernetes.io/projected/c7cb175b-c967-47c2-96b0-da043b6d3506-kube-api-access-4fxds\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.243699 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7cb175b-c967-47c2-96b0-da043b6d3506-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.243707 4735 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7cb175b-c967-47c2-96b0-da043b6d3506-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.243717 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7cb175b-c967-47c2-96b0-da043b6d3506-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.545792 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7bcff764fb-c7nmm" podUID="45ed423f-4895-4df3-9a04-2b916f38f57d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.710032 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7746dbdbf6-t6f7n" podUID="7353c4ca-59bc-4a50-8840-8365f90f6384" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.728852 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-t8d7j" Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.728844 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-t8d7j" event={"ID":"c7cb175b-c967-47c2-96b0-da043b6d3506","Type":"ContainerDied","Data":"76c946918ca0b6e73121c36c73127c91f19bd2e50d94618b1316961cd197bf8f"} Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.728893 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76c946918ca0b6e73121c36c73127c91f19bd2e50d94618b1316961cd197bf8f" Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.823830 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-85c495dfd8-k8nc5"] Oct 01 10:33:32 crc kubenswrapper[4735]: E1001 10:33:32.824401 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7cb175b-c967-47c2-96b0-da043b6d3506" containerName="placement-db-sync" Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.824465 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7cb175b-c967-47c2-96b0-da043b6d3506" containerName="placement-db-sync" Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.824758 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7cb175b-c967-47c2-96b0-da043b6d3506" containerName="placement-db-sync" Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.825676 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-85c495dfd8-k8nc5" Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.828254 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.828548 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.828659 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.828728 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.828797 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-c8srt" Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.839817 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-85c495dfd8-k8nc5"] Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.963873 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5542515c-c850-46f5-875b-65c55c28cbdc-logs\") pod \"placement-85c495dfd8-k8nc5\" (UID: \"5542515c-c850-46f5-875b-65c55c28cbdc\") " pod="openstack/placement-85c495dfd8-k8nc5" Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.963933 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5542515c-c850-46f5-875b-65c55c28cbdc-public-tls-certs\") pod \"placement-85c495dfd8-k8nc5\" (UID: \"5542515c-c850-46f5-875b-65c55c28cbdc\") " pod="openstack/placement-85c495dfd8-k8nc5" Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.964292 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5542515c-c850-46f5-875b-65c55c28cbdc-combined-ca-bundle\") pod \"placement-85c495dfd8-k8nc5\" (UID: \"5542515c-c850-46f5-875b-65c55c28cbdc\") " pod="openstack/placement-85c495dfd8-k8nc5" Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.964348 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5542515c-c850-46f5-875b-65c55c28cbdc-internal-tls-certs\") pod \"placement-85c495dfd8-k8nc5\" (UID: \"5542515c-c850-46f5-875b-65c55c28cbdc\") " pod="openstack/placement-85c495dfd8-k8nc5" Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.964659 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5542515c-c850-46f5-875b-65c55c28cbdc-scripts\") pod \"placement-85c495dfd8-k8nc5\" (UID: \"5542515c-c850-46f5-875b-65c55c28cbdc\") " pod="openstack/placement-85c495dfd8-k8nc5" Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.964734 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnhc9\" (UniqueName: \"kubernetes.io/projected/5542515c-c850-46f5-875b-65c55c28cbdc-kube-api-access-xnhc9\") pod \"placement-85c495dfd8-k8nc5\" (UID: \"5542515c-c850-46f5-875b-65c55c28cbdc\") " pod="openstack/placement-85c495dfd8-k8nc5" Oct 01 10:33:32 crc kubenswrapper[4735]: I1001 10:33:32.964779 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5542515c-c850-46f5-875b-65c55c28cbdc-config-data\") pod \"placement-85c495dfd8-k8nc5\" (UID: \"5542515c-c850-46f5-875b-65c55c28cbdc\") " pod="openstack/placement-85c495dfd8-k8nc5" Oct 01 10:33:33 crc kubenswrapper[4735]: I1001 10:33:33.067032 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5542515c-c850-46f5-875b-65c55c28cbdc-scripts\") pod \"placement-85c495dfd8-k8nc5\" (UID: \"5542515c-c850-46f5-875b-65c55c28cbdc\") " pod="openstack/placement-85c495dfd8-k8nc5" Oct 01 10:33:33 crc kubenswrapper[4735]: I1001 10:33:33.067096 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnhc9\" (UniqueName: \"kubernetes.io/projected/5542515c-c850-46f5-875b-65c55c28cbdc-kube-api-access-xnhc9\") pod \"placement-85c495dfd8-k8nc5\" (UID: \"5542515c-c850-46f5-875b-65c55c28cbdc\") " pod="openstack/placement-85c495dfd8-k8nc5" Oct 01 10:33:33 crc kubenswrapper[4735]: I1001 10:33:33.067129 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5542515c-c850-46f5-875b-65c55c28cbdc-config-data\") pod \"placement-85c495dfd8-k8nc5\" (UID: \"5542515c-c850-46f5-875b-65c55c28cbdc\") " pod="openstack/placement-85c495dfd8-k8nc5" Oct 01 10:33:33 crc kubenswrapper[4735]: I1001 10:33:33.067172 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5542515c-c850-46f5-875b-65c55c28cbdc-logs\") pod \"placement-85c495dfd8-k8nc5\" (UID: \"5542515c-c850-46f5-875b-65c55c28cbdc\") " pod="openstack/placement-85c495dfd8-k8nc5" Oct 01 10:33:33 crc kubenswrapper[4735]: I1001 10:33:33.067195 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5542515c-c850-46f5-875b-65c55c28cbdc-public-tls-certs\") pod \"placement-85c495dfd8-k8nc5\" (UID: \"5542515c-c850-46f5-875b-65c55c28cbdc\") " pod="openstack/placement-85c495dfd8-k8nc5" Oct 01 10:33:33 crc kubenswrapper[4735]: I1001 10:33:33.067267 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/5542515c-c850-46f5-875b-65c55c28cbdc-combined-ca-bundle\") pod \"placement-85c495dfd8-k8nc5\" (UID: \"5542515c-c850-46f5-875b-65c55c28cbdc\") " pod="openstack/placement-85c495dfd8-k8nc5" Oct 01 10:33:33 crc kubenswrapper[4735]: I1001 10:33:33.067295 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5542515c-c850-46f5-875b-65c55c28cbdc-internal-tls-certs\") pod \"placement-85c495dfd8-k8nc5\" (UID: \"5542515c-c850-46f5-875b-65c55c28cbdc\") " pod="openstack/placement-85c495dfd8-k8nc5" Oct 01 10:33:33 crc kubenswrapper[4735]: I1001 10:33:33.068049 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5542515c-c850-46f5-875b-65c55c28cbdc-logs\") pod \"placement-85c495dfd8-k8nc5\" (UID: \"5542515c-c850-46f5-875b-65c55c28cbdc\") " pod="openstack/placement-85c495dfd8-k8nc5" Oct 01 10:33:33 crc kubenswrapper[4735]: I1001 10:33:33.071075 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5542515c-c850-46f5-875b-65c55c28cbdc-internal-tls-certs\") pod \"placement-85c495dfd8-k8nc5\" (UID: \"5542515c-c850-46f5-875b-65c55c28cbdc\") " pod="openstack/placement-85c495dfd8-k8nc5" Oct 01 10:33:33 crc kubenswrapper[4735]: I1001 10:33:33.072289 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5542515c-c850-46f5-875b-65c55c28cbdc-public-tls-certs\") pod \"placement-85c495dfd8-k8nc5\" (UID: \"5542515c-c850-46f5-875b-65c55c28cbdc\") " pod="openstack/placement-85c495dfd8-k8nc5" Oct 01 10:33:33 crc kubenswrapper[4735]: I1001 10:33:33.073058 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5542515c-c850-46f5-875b-65c55c28cbdc-config-data\") pod 
\"placement-85c495dfd8-k8nc5\" (UID: \"5542515c-c850-46f5-875b-65c55c28cbdc\") " pod="openstack/placement-85c495dfd8-k8nc5" Oct 01 10:33:33 crc kubenswrapper[4735]: I1001 10:33:33.073868 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5542515c-c850-46f5-875b-65c55c28cbdc-scripts\") pod \"placement-85c495dfd8-k8nc5\" (UID: \"5542515c-c850-46f5-875b-65c55c28cbdc\") " pod="openstack/placement-85c495dfd8-k8nc5" Oct 01 10:33:33 crc kubenswrapper[4735]: I1001 10:33:33.080589 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5542515c-c850-46f5-875b-65c55c28cbdc-combined-ca-bundle\") pod \"placement-85c495dfd8-k8nc5\" (UID: \"5542515c-c850-46f5-875b-65c55c28cbdc\") " pod="openstack/placement-85c495dfd8-k8nc5" Oct 01 10:33:33 crc kubenswrapper[4735]: I1001 10:33:33.083183 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnhc9\" (UniqueName: \"kubernetes.io/projected/5542515c-c850-46f5-875b-65c55c28cbdc-kube-api-access-xnhc9\") pod \"placement-85c495dfd8-k8nc5\" (UID: \"5542515c-c850-46f5-875b-65c55c28cbdc\") " pod="openstack/placement-85c495dfd8-k8nc5" Oct 01 10:33:33 crc kubenswrapper[4735]: I1001 10:33:33.146744 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-85c495dfd8-k8nc5" Oct 01 10:33:33 crc kubenswrapper[4735]: I1001 10:33:33.600602 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-85c495dfd8-k8nc5"] Oct 01 10:33:33 crc kubenswrapper[4735]: I1001 10:33:33.736386 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-85c495dfd8-k8nc5" event={"ID":"5542515c-c850-46f5-875b-65c55c28cbdc","Type":"ContainerStarted","Data":"8579a39640692a03b59d97bd34056f658599cc6134bd85d4f4e6212d01efb391"} Oct 01 10:33:34 crc kubenswrapper[4735]: E1001 10:33:34.158173 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://quay.io/openstack-k8s-operators/sg-core:latest: can't talk to a V1 container registry" image="quay.io/openstack-k8s-operators/sg-core:latest" Oct 01 10:33:34 crc kubenswrapper[4735]: E1001 10:33:34.158295 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:sg-core,Image:quay.io/openstack-k8s-operators/sg-core:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:sg-core-conf-yaml,ReadOnly:false,MountPath:/etc/sg-core.conf.yaml,SubPath:sg-core.conf.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xtnvx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompPro
file:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(7ea41140-1a15-4e64-89e5-d68b9208dff1): ErrImagePull: initializing source docker://quay.io/openstack-k8s-operators/sg-core:latest: can't talk to a V1 container registry" logger="UnhandledError" Oct 01 10:33:34 crc kubenswrapper[4735]: I1001 10:33:34.749147 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-85c495dfd8-k8nc5" event={"ID":"5542515c-c850-46f5-875b-65c55c28cbdc","Type":"ContainerStarted","Data":"16e4e57f82ce37ffd74581426a0c0f8c3f76f89943ed8f13390955551a612fdc"} Oct 01 10:33:34 crc kubenswrapper[4735]: I1001 10:33:34.749609 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-85c495dfd8-k8nc5" event={"ID":"5542515c-c850-46f5-875b-65c55c28cbdc","Type":"ContainerStarted","Data":"c62ad960f16a0bc69d0204143803240788ec521e4920df0aa23bc2a5b33044b3"} Oct 01 10:33:34 crc kubenswrapper[4735]: I1001 10:33:34.749790 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-85c495dfd8-k8nc5" Oct 01 10:33:34 crc kubenswrapper[4735]: I1001 10:33:34.749815 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-85c495dfd8-k8nc5" Oct 01 10:33:34 crc kubenswrapper[4735]: I1001 10:33:34.779561 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-85c495dfd8-k8nc5" podStartSLOduration=2.779544087 podStartE2EDuration="2.779544087s" podCreationTimestamp="2025-10-01 10:33:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:33:34.776890816 +0000 UTC m=+973.469712088" watchObservedRunningTime="2025-10-01 10:33:34.779544087 +0000 UTC 
m=+973.472365349" Oct 01 10:33:35 crc kubenswrapper[4735]: I1001 10:33:35.486417 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 10:33:35 crc kubenswrapper[4735]: I1001 10:33:35.486532 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 10:33:38 crc kubenswrapper[4735]: E1001 10:33:38.320534 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = parsing image configuration: fetching blob: received unexpected HTTP status: 504 Gateway Time-out" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Oct 01 10:33:38 crc kubenswrapper[4735]: E1001 10:33:38.320985 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rg2vn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-8cvdn_openstack(541784dc-4146-459d-bee0-2f97d22a7977): ErrImagePull: parsing image configuration: fetching blob: received unexpected HTTP status: 504 Gateway Time-out" logger="UnhandledError" Oct 01 10:33:38 crc kubenswrapper[4735]: E1001 10:33:38.322178 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"parsing image configuration: fetching blob: received unexpected HTTP status: 504 Gateway Time-out\"" 
pod="openstack/barbican-db-sync-8cvdn" podUID="541784dc-4146-459d-bee0-2f97d22a7977" Oct 01 10:33:38 crc kubenswrapper[4735]: E1001 10:33:38.589231 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified: reading manifest current-podified in quay.io/podified-antelope-centos9/openstack-cinder-api: received unexpected HTTP status: 504 Gateway Time-out" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 01 10:33:38 crc kubenswrapper[4735]: E1001 10:33:38.589461 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kol
la/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mbrm9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-84vvz_openstack(61dd2f37-7f60-42f5-a3d0-3b693d1e64be): ErrImagePull: initializing source docker://quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified: reading manifest current-podified in quay.io/podified-antelope-centos9/openstack-cinder-api: received unexpected HTTP status: 504 Gateway Time-out" logger="UnhandledError" Oct 01 10:33:38 crc kubenswrapper[4735]: E1001 10:33:38.590724 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"initializing source docker://quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified: reading manifest current-podified in quay.io/podified-antelope-centos9/openstack-cinder-api: received unexpected HTTP status: 504 Gateway Time-out\"" pod="openstack/cinder-db-sync-84vvz" podUID="61dd2f37-7f60-42f5-a3d0-3b693d1e64be" Oct 01 10:33:38 crc 
kubenswrapper[4735]: E1001 10:33:38.793317 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-84vvz" podUID="61dd2f37-7f60-42f5-a3d0-3b693d1e64be" Oct 01 10:33:38 crc kubenswrapper[4735]: E1001 10:33:38.795453 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-8cvdn" podUID="541784dc-4146-459d-bee0-2f97d22a7977" Oct 01 10:33:42 crc kubenswrapper[4735]: I1001 10:33:42.850435 4735 generic.go:334] "Generic (PLEG): container finished" podID="e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6" containerID="9e576938ab80f23960b56223b285b3cd0eb4e29059a1804a453a3467282b53cf" exitCode=0 Oct 01 10:33:42 crc kubenswrapper[4735]: I1001 10:33:42.850562 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-grznc" event={"ID":"e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6","Type":"ContainerDied","Data":"9e576938ab80f23960b56223b285b3cd0eb4e29059a1804a453a3467282b53cf"} Oct 01 10:33:44 crc kubenswrapper[4735]: I1001 10:33:44.206161 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-grznc" Oct 01 10:33:44 crc kubenswrapper[4735]: I1001 10:33:44.290013 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7bcff764fb-c7nmm" Oct 01 10:33:44 crc kubenswrapper[4735]: I1001 10:33:44.311325 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6t45v\" (UniqueName: \"kubernetes.io/projected/e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6-kube-api-access-6t45v\") pod \"e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6\" (UID: \"e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6\") " Oct 01 10:33:44 crc kubenswrapper[4735]: I1001 10:33:44.311423 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6-combined-ca-bundle\") pod \"e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6\" (UID: \"e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6\") " Oct 01 10:33:44 crc kubenswrapper[4735]: I1001 10:33:44.311598 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6-config\") pod \"e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6\" (UID: \"e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6\") " Oct 01 10:33:44 crc kubenswrapper[4735]: I1001 10:33:44.320872 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6-kube-api-access-6t45v" (OuterVolumeSpecName: "kube-api-access-6t45v") pod "e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6" (UID: "e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6"). InnerVolumeSpecName "kube-api-access-6t45v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:33:44 crc kubenswrapper[4735]: I1001 10:33:44.345224 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6" (UID: "e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:33:44 crc kubenswrapper[4735]: I1001 10:33:44.346045 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6-config" (OuterVolumeSpecName: "config") pod "e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6" (UID: "e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:33:44 crc kubenswrapper[4735]: E1001 10:33:44.355824 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/ubi9/httpd-24:latest: reading manifest latest in registry.redhat.io/ubi9/httpd-24: received unexpected HTTP status: 504 Gateway Timeout" image="registry.redhat.io/ubi9/httpd-24:latest" Oct 01 10:33:44 crc kubenswrapper[4735]: E1001 10:33:44.356023 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24:latest,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xtnvx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(7ea41140-1a15-4e64-89e5-d68b9208dff1): ErrImagePull: initializing source docker://registry.redhat.io/ubi9/httpd-24:latest: reading manifest latest in registry.redhat.io/ubi9/httpd-24: received unexpected HTTP status: 504 Gateway Timeout" logger="UnhandledError" Oct 01 10:33:44 crc kubenswrapper[4735]: E1001 10:33:44.357248 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"sg-core\" with ErrImagePull: \"initializing source docker://quay.io/openstack-k8s-operators/sg-core:latest: can't talk to a V1 container registry\", failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"initializing source docker://registry.redhat.io/ubi9/httpd-24:latest: reading manifest latest in registry.redhat.io/ubi9/httpd-24: received unexpected HTTP status: 504 Gateway Timeout\"]" pod="openstack/ceilometer-0" podUID="7ea41140-1a15-4e64-89e5-d68b9208dff1" Oct 01 10:33:44 crc kubenswrapper[4735]: I1001 10:33:44.414738 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6t45v\" (UniqueName: \"kubernetes.io/projected/e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6-kube-api-access-6t45v\") on node \"crc\" 
DevicePath \"\"" Oct 01 10:33:44 crc kubenswrapper[4735]: I1001 10:33:44.414778 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:44 crc kubenswrapper[4735]: I1001 10:33:44.414791 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:44 crc kubenswrapper[4735]: I1001 10:33:44.481575 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7746dbdbf6-t6f7n" Oct 01 10:33:44 crc kubenswrapper[4735]: I1001 10:33:44.877888 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-grznc" Oct 01 10:33:44 crc kubenswrapper[4735]: I1001 10:33:44.878238 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7ea41140-1a15-4e64-89e5-d68b9208dff1" containerName="ceilometer-notification-agent" containerID="cri-o://4154d64030a31e12e10c808e937f2a06aa24a479c594a933c0e20d271f3ac8a8" gracePeriod=30 Oct 01 10:33:44 crc kubenswrapper[4735]: I1001 10:33:44.878031 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7ea41140-1a15-4e64-89e5-d68b9208dff1" containerName="ceilometer-central-agent" containerID="cri-o://8e11edbb3a51634f53489e50353ab8eca5b4e68b02bec161bef6c00e771954b1" gracePeriod=30 Oct 01 10:33:44 crc kubenswrapper[4735]: I1001 10:33:44.878105 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-grznc" event={"ID":"e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6","Type":"ContainerDied","Data":"7e8cdb021bc7826e5b9706cf1bdc804b4520ce1564744aa3799b7f88c4e78f19"} Oct 01 10:33:44 crc kubenswrapper[4735]: I1001 10:33:44.879004 4735 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e8cdb021bc7826e5b9706cf1bdc804b4520ce1564744aa3799b7f88c4e78f19" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.109851 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-g2kcn"] Oct 01 10:33:45 crc kubenswrapper[4735]: E1001 10:33:45.112418 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6" containerName="neutron-db-sync" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.112440 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6" containerName="neutron-db-sync" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.113888 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6" containerName="neutron-db-sync" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.114822 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-g2kcn" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.126422 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-g2kcn"] Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.127642 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-g2kcn\" (UID: \"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e\") " pod="openstack/dnsmasq-dns-84b966f6c9-g2kcn" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.127790 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7qbh\" (UniqueName: \"kubernetes.io/projected/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-kube-api-access-r7qbh\") pod \"dnsmasq-dns-84b966f6c9-g2kcn\" (UID: \"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e\") " pod="openstack/dnsmasq-dns-84b966f6c9-g2kcn" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.127913 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-g2kcn\" (UID: \"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e\") " pod="openstack/dnsmasq-dns-84b966f6c9-g2kcn" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.128035 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-config\") pod \"dnsmasq-dns-84b966f6c9-g2kcn\" (UID: \"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e\") " pod="openstack/dnsmasq-dns-84b966f6c9-g2kcn" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.128131 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-g2kcn\" (UID: \"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e\") " pod="openstack/dnsmasq-dns-84b966f6c9-g2kcn" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.128237 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-g2kcn\" (UID: \"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e\") " pod="openstack/dnsmasq-dns-84b966f6c9-g2kcn" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.210981 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-94689896b-kwvtc"] Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.212383 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-94689896b-kwvtc" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.215648 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-mp5h7" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.215850 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.216750 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.217084 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.222884 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-94689896b-kwvtc"] Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.230106 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f2b4d4c-d741-4a88-b71c-fda5946d3896-combined-ca-bundle\") pod \"neutron-94689896b-kwvtc\" (UID: \"5f2b4d4c-d741-4a88-b71c-fda5946d3896\") " pod="openstack/neutron-94689896b-kwvtc" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.230158 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-g2kcn\" (UID: \"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e\") " pod="openstack/dnsmasq-dns-84b966f6c9-g2kcn" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.230216 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-config\") pod \"dnsmasq-dns-84b966f6c9-g2kcn\" (UID: \"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e\") " pod="openstack/dnsmasq-dns-84b966f6c9-g2kcn" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.230266 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-g2kcn\" (UID: \"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e\") " pod="openstack/dnsmasq-dns-84b966f6c9-g2kcn" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.230283 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f2b4d4c-d741-4a88-b71c-fda5946d3896-ovndb-tls-certs\") pod \"neutron-94689896b-kwvtc\" (UID: \"5f2b4d4c-d741-4a88-b71c-fda5946d3896\") " pod="openstack/neutron-94689896b-kwvtc" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.230313 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-9dwvk\" (UniqueName: \"kubernetes.io/projected/5f2b4d4c-d741-4a88-b71c-fda5946d3896-kube-api-access-9dwvk\") pod \"neutron-94689896b-kwvtc\" (UID: \"5f2b4d4c-d741-4a88-b71c-fda5946d3896\") " pod="openstack/neutron-94689896b-kwvtc" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.230340 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-g2kcn\" (UID: \"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e\") " pod="openstack/dnsmasq-dns-84b966f6c9-g2kcn" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.230370 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-g2kcn\" (UID: \"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e\") " pod="openstack/dnsmasq-dns-84b966f6c9-g2kcn" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.230404 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5f2b4d4c-d741-4a88-b71c-fda5946d3896-config\") pod \"neutron-94689896b-kwvtc\" (UID: \"5f2b4d4c-d741-4a88-b71c-fda5946d3896\") " pod="openstack/neutron-94689896b-kwvtc" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.230423 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5f2b4d4c-d741-4a88-b71c-fda5946d3896-httpd-config\") pod \"neutron-94689896b-kwvtc\" (UID: \"5f2b4d4c-d741-4a88-b71c-fda5946d3896\") " pod="openstack/neutron-94689896b-kwvtc" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.230451 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7qbh\" (UniqueName: 
\"kubernetes.io/projected/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-kube-api-access-r7qbh\") pod \"dnsmasq-dns-84b966f6c9-g2kcn\" (UID: \"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e\") " pod="openstack/dnsmasq-dns-84b966f6c9-g2kcn" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.231736 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-g2kcn\" (UID: \"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e\") " pod="openstack/dnsmasq-dns-84b966f6c9-g2kcn" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.232289 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-config\") pod \"dnsmasq-dns-84b966f6c9-g2kcn\" (UID: \"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e\") " pod="openstack/dnsmasq-dns-84b966f6c9-g2kcn" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.233906 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-g2kcn\" (UID: \"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e\") " pod="openstack/dnsmasq-dns-84b966f6c9-g2kcn" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.234562 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-g2kcn\" (UID: \"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e\") " pod="openstack/dnsmasq-dns-84b966f6c9-g2kcn" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.235077 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-dns-swift-storage-0\") pod 
\"dnsmasq-dns-84b966f6c9-g2kcn\" (UID: \"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e\") " pod="openstack/dnsmasq-dns-84b966f6c9-g2kcn" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.250433 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7qbh\" (UniqueName: \"kubernetes.io/projected/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-kube-api-access-r7qbh\") pod \"dnsmasq-dns-84b966f6c9-g2kcn\" (UID: \"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e\") " pod="openstack/dnsmasq-dns-84b966f6c9-g2kcn" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.332637 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f2b4d4c-d741-4a88-b71c-fda5946d3896-ovndb-tls-certs\") pod \"neutron-94689896b-kwvtc\" (UID: \"5f2b4d4c-d741-4a88-b71c-fda5946d3896\") " pod="openstack/neutron-94689896b-kwvtc" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.332835 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dwvk\" (UniqueName: \"kubernetes.io/projected/5f2b4d4c-d741-4a88-b71c-fda5946d3896-kube-api-access-9dwvk\") pod \"neutron-94689896b-kwvtc\" (UID: \"5f2b4d4c-d741-4a88-b71c-fda5946d3896\") " pod="openstack/neutron-94689896b-kwvtc" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.332925 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5f2b4d4c-d741-4a88-b71c-fda5946d3896-config\") pod \"neutron-94689896b-kwvtc\" (UID: \"5f2b4d4c-d741-4a88-b71c-fda5946d3896\") " pod="openstack/neutron-94689896b-kwvtc" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.332949 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5f2b4d4c-d741-4a88-b71c-fda5946d3896-httpd-config\") pod \"neutron-94689896b-kwvtc\" (UID: \"5f2b4d4c-d741-4a88-b71c-fda5946d3896\") " 
pod="openstack/neutron-94689896b-kwvtc" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.332973 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f2b4d4c-d741-4a88-b71c-fda5946d3896-combined-ca-bundle\") pod \"neutron-94689896b-kwvtc\" (UID: \"5f2b4d4c-d741-4a88-b71c-fda5946d3896\") " pod="openstack/neutron-94689896b-kwvtc" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.338002 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5f2b4d4c-d741-4a88-b71c-fda5946d3896-config\") pod \"neutron-94689896b-kwvtc\" (UID: \"5f2b4d4c-d741-4a88-b71c-fda5946d3896\") " pod="openstack/neutron-94689896b-kwvtc" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.338725 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f2b4d4c-d741-4a88-b71c-fda5946d3896-ovndb-tls-certs\") pod \"neutron-94689896b-kwvtc\" (UID: \"5f2b4d4c-d741-4a88-b71c-fda5946d3896\") " pod="openstack/neutron-94689896b-kwvtc" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.339094 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f2b4d4c-d741-4a88-b71c-fda5946d3896-combined-ca-bundle\") pod \"neutron-94689896b-kwvtc\" (UID: \"5f2b4d4c-d741-4a88-b71c-fda5946d3896\") " pod="openstack/neutron-94689896b-kwvtc" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.345193 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5f2b4d4c-d741-4a88-b71c-fda5946d3896-httpd-config\") pod \"neutron-94689896b-kwvtc\" (UID: \"5f2b4d4c-d741-4a88-b71c-fda5946d3896\") " pod="openstack/neutron-94689896b-kwvtc" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.355868 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9dwvk\" (UniqueName: \"kubernetes.io/projected/5f2b4d4c-d741-4a88-b71c-fda5946d3896-kube-api-access-9dwvk\") pod \"neutron-94689896b-kwvtc\" (UID: \"5f2b4d4c-d741-4a88-b71c-fda5946d3896\") " pod="openstack/neutron-94689896b-kwvtc" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.449214 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-g2kcn" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.532116 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-94689896b-kwvtc" Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.751351 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-g2kcn"] Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.890487 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-g2kcn" event={"ID":"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e","Type":"ContainerStarted","Data":"5144aa29f5ee863ea50838f1bd82ee74c373730beb94051d113bd4ec53f5e999"} Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.892680 4735 generic.go:334] "Generic (PLEG): container finished" podID="7ea41140-1a15-4e64-89e5-d68b9208dff1" containerID="4154d64030a31e12e10c808e937f2a06aa24a479c594a933c0e20d271f3ac8a8" exitCode=0 Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.892701 4735 generic.go:334] "Generic (PLEG): container finished" podID="7ea41140-1a15-4e64-89e5-d68b9208dff1" containerID="8e11edbb3a51634f53489e50353ab8eca5b4e68b02bec161bef6c00e771954b1" exitCode=0 Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.892714 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ea41140-1a15-4e64-89e5-d68b9208dff1","Type":"ContainerDied","Data":"4154d64030a31e12e10c808e937f2a06aa24a479c594a933c0e20d271f3ac8a8"} Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 
10:33:45.892728 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ea41140-1a15-4e64-89e5-d68b9208dff1","Type":"ContainerDied","Data":"8e11edbb3a51634f53489e50353ab8eca5b4e68b02bec161bef6c00e771954b1"} Oct 01 10:33:45 crc kubenswrapper[4735]: I1001 10:33:45.991454 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7bcff764fb-c7nmm" Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.056397 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.068940 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ea41140-1a15-4e64-89e5-d68b9208dff1-config-data\") pod \"7ea41140-1a15-4e64-89e5-d68b9208dff1\" (UID: \"7ea41140-1a15-4e64-89e5-d68b9208dff1\") " Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.069014 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ea41140-1a15-4e64-89e5-d68b9208dff1-sg-core-conf-yaml\") pod \"7ea41140-1a15-4e64-89e5-d68b9208dff1\" (UID: \"7ea41140-1a15-4e64-89e5-d68b9208dff1\") " Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.069058 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtnvx\" (UniqueName: \"kubernetes.io/projected/7ea41140-1a15-4e64-89e5-d68b9208dff1-kube-api-access-xtnvx\") pod \"7ea41140-1a15-4e64-89e5-d68b9208dff1\" (UID: \"7ea41140-1a15-4e64-89e5-d68b9208dff1\") " Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.069140 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ea41140-1a15-4e64-89e5-d68b9208dff1-combined-ca-bundle\") pod \"7ea41140-1a15-4e64-89e5-d68b9208dff1\" (UID: 
\"7ea41140-1a15-4e64-89e5-d68b9208dff1\") " Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.069199 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ea41140-1a15-4e64-89e5-d68b9208dff1-run-httpd\") pod \"7ea41140-1a15-4e64-89e5-d68b9208dff1\" (UID: \"7ea41140-1a15-4e64-89e5-d68b9208dff1\") " Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.069280 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ea41140-1a15-4e64-89e5-d68b9208dff1-scripts\") pod \"7ea41140-1a15-4e64-89e5-d68b9208dff1\" (UID: \"7ea41140-1a15-4e64-89e5-d68b9208dff1\") " Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.069303 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ea41140-1a15-4e64-89e5-d68b9208dff1-log-httpd\") pod \"7ea41140-1a15-4e64-89e5-d68b9208dff1\" (UID: \"7ea41140-1a15-4e64-89e5-d68b9208dff1\") " Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.071327 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ea41140-1a15-4e64-89e5-d68b9208dff1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7ea41140-1a15-4e64-89e5-d68b9208dff1" (UID: "7ea41140-1a15-4e64-89e5-d68b9208dff1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.072526 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ea41140-1a15-4e64-89e5-d68b9208dff1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7ea41140-1a15-4e64-89e5-d68b9208dff1" (UID: "7ea41140-1a15-4e64-89e5-d68b9208dff1"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.081007 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ea41140-1a15-4e64-89e5-d68b9208dff1-scripts" (OuterVolumeSpecName: "scripts") pod "7ea41140-1a15-4e64-89e5-d68b9208dff1" (UID: "7ea41140-1a15-4e64-89e5-d68b9208dff1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.082556 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ea41140-1a15-4e64-89e5-d68b9208dff1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7ea41140-1a15-4e64-89e5-d68b9208dff1" (UID: "7ea41140-1a15-4e64-89e5-d68b9208dff1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.086174 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ea41140-1a15-4e64-89e5-d68b9208dff1-kube-api-access-xtnvx" (OuterVolumeSpecName: "kube-api-access-xtnvx") pod "7ea41140-1a15-4e64-89e5-d68b9208dff1" (UID: "7ea41140-1a15-4e64-89e5-d68b9208dff1"). InnerVolumeSpecName "kube-api-access-xtnvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.124334 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ea41140-1a15-4e64-89e5-d68b9208dff1-config-data" (OuterVolumeSpecName: "config-data") pod "7ea41140-1a15-4e64-89e5-d68b9208dff1" (UID: "7ea41140-1a15-4e64-89e5-d68b9208dff1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.126101 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ea41140-1a15-4e64-89e5-d68b9208dff1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ea41140-1a15-4e64-89e5-d68b9208dff1" (UID: "7ea41140-1a15-4e64-89e5-d68b9208dff1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.172562 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ea41140-1a15-4e64-89e5-d68b9208dff1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.172593 4735 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ea41140-1a15-4e64-89e5-d68b9208dff1-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.172602 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ea41140-1a15-4e64-89e5-d68b9208dff1-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.172610 4735 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ea41140-1a15-4e64-89e5-d68b9208dff1-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.172619 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ea41140-1a15-4e64-89e5-d68b9208dff1-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.172627 4735 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/7ea41140-1a15-4e64-89e5-d68b9208dff1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.172635 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtnvx\" (UniqueName: \"kubernetes.io/projected/7ea41140-1a15-4e64-89e5-d68b9208dff1-kube-api-access-xtnvx\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.219336 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7746dbdbf6-t6f7n" Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.264448 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7bcff764fb-c7nmm"] Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.471240 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-94689896b-kwvtc"] Oct 01 10:33:46 crc kubenswrapper[4735]: W1001 10:33:46.477857 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f2b4d4c_d741_4a88_b71c_fda5946d3896.slice/crio-6a0bd98ea2644432043b0e5f08007f1b0d5005801036a0d0e70b112faf6209eb WatchSource:0}: Error finding container 6a0bd98ea2644432043b0e5f08007f1b0d5005801036a0d0e70b112faf6209eb: Status 404 returned error can't find the container with id 6a0bd98ea2644432043b0e5f08007f1b0d5005801036a0d0e70b112faf6209eb Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.904797 4735 generic.go:334] "Generic (PLEG): container finished" podID="fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e" containerID="38a005e749c834ab5d772a75cf1f153fadc4bd76075efd9d5da299f369be7096" exitCode=0 Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.905001 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-g2kcn" 
event={"ID":"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e","Type":"ContainerDied","Data":"38a005e749c834ab5d772a75cf1f153fadc4bd76075efd9d5da299f369be7096"} Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.911554 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.911573 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ea41140-1a15-4e64-89e5-d68b9208dff1","Type":"ContainerDied","Data":"e6feb6966c4d38b0eb4e8a892ac550439b1569b1d50851296b4e8560a4a6db55"} Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.911621 4735 scope.go:117] "RemoveContainer" containerID="4154d64030a31e12e10c808e937f2a06aa24a479c594a933c0e20d271f3ac8a8" Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.914451 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7bcff764fb-c7nmm" podUID="45ed423f-4895-4df3-9a04-2b916f38f57d" containerName="horizon-log" containerID="cri-o://3c26fba4b03c5d2ead52a8171b3d00ed23efc77ca18ab4fe450fd043e2e25cb4" gracePeriod=30 Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.915355 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-94689896b-kwvtc" event={"ID":"5f2b4d4c-d741-4a88-b71c-fda5946d3896","Type":"ContainerStarted","Data":"efb380d7acdfb188019f59c7ffb0a953d8a3325c7e5425bdcf6f172f779bd4ad"} Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.915395 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-94689896b-kwvtc" Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.915412 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-94689896b-kwvtc" event={"ID":"5f2b4d4c-d741-4a88-b71c-fda5946d3896","Type":"ContainerStarted","Data":"4c2bb6ac267ef9269e11d07112b91da398af5bddd4295e35e8730117353b16f4"} Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 
10:33:46.915426 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-94689896b-kwvtc" event={"ID":"5f2b4d4c-d741-4a88-b71c-fda5946d3896","Type":"ContainerStarted","Data":"6a0bd98ea2644432043b0e5f08007f1b0d5005801036a0d0e70b112faf6209eb"} Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.915473 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7bcff764fb-c7nmm" podUID="45ed423f-4895-4df3-9a04-2b916f38f57d" containerName="horizon" containerID="cri-o://7a05d0208176e16ca827de0751205940fe05ef3bd18c2ddf1872862f95be80ef" gracePeriod=30 Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.948867 4735 scope.go:117] "RemoveContainer" containerID="8e11edbb3a51634f53489e50353ab8eca5b4e68b02bec161bef6c00e771954b1" Oct 01 10:33:46 crc kubenswrapper[4735]: I1001 10:33:46.974287 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-94689896b-kwvtc" podStartSLOduration=1.974266232 podStartE2EDuration="1.974266232s" podCreationTimestamp="2025-10-01 10:33:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:33:46.966887574 +0000 UTC m=+985.659708846" watchObservedRunningTime="2025-10-01 10:33:46.974266232 +0000 UTC m=+985.667087494" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.012755 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.027789 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.051390 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:33:47 crc kubenswrapper[4735]: E1001 10:33:47.052547 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ea41140-1a15-4e64-89e5-d68b9208dff1" 
containerName="ceilometer-notification-agent" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.052571 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ea41140-1a15-4e64-89e5-d68b9208dff1" containerName="ceilometer-notification-agent" Oct 01 10:33:47 crc kubenswrapper[4735]: E1001 10:33:47.052633 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ea41140-1a15-4e64-89e5-d68b9208dff1" containerName="ceilometer-central-agent" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.052640 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ea41140-1a15-4e64-89e5-d68b9208dff1" containerName="ceilometer-central-agent" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.052869 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ea41140-1a15-4e64-89e5-d68b9208dff1" containerName="ceilometer-central-agent" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.052918 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ea41140-1a15-4e64-89e5-d68b9208dff1" containerName="ceilometer-notification-agent" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.054540 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.062373 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.064404 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.064614 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.088685 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbxhx\" (UniqueName: \"kubernetes.io/projected/8506c575-4d5d-4691-9ebd-75d8c878f9a9-kube-api-access-bbxhx\") pod \"ceilometer-0\" (UID: \"8506c575-4d5d-4691-9ebd-75d8c878f9a9\") " pod="openstack/ceilometer-0" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.088786 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8506c575-4d5d-4691-9ebd-75d8c878f9a9-log-httpd\") pod \"ceilometer-0\" (UID: \"8506c575-4d5d-4691-9ebd-75d8c878f9a9\") " pod="openstack/ceilometer-0" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.088805 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8506c575-4d5d-4691-9ebd-75d8c878f9a9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8506c575-4d5d-4691-9ebd-75d8c878f9a9\") " pod="openstack/ceilometer-0" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.088822 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8506c575-4d5d-4691-9ebd-75d8c878f9a9-run-httpd\") pod \"ceilometer-0\" (UID: 
\"8506c575-4d5d-4691-9ebd-75d8c878f9a9\") " pod="openstack/ceilometer-0" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.088867 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8506c575-4d5d-4691-9ebd-75d8c878f9a9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8506c575-4d5d-4691-9ebd-75d8c878f9a9\") " pod="openstack/ceilometer-0" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.088897 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8506c575-4d5d-4691-9ebd-75d8c878f9a9-scripts\") pod \"ceilometer-0\" (UID: \"8506c575-4d5d-4691-9ebd-75d8c878f9a9\") " pod="openstack/ceilometer-0" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.088918 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8506c575-4d5d-4691-9ebd-75d8c878f9a9-config-data\") pod \"ceilometer-0\" (UID: \"8506c575-4d5d-4691-9ebd-75d8c878f9a9\") " pod="openstack/ceilometer-0" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.189601 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8506c575-4d5d-4691-9ebd-75d8c878f9a9-config-data\") pod \"ceilometer-0\" (UID: \"8506c575-4d5d-4691-9ebd-75d8c878f9a9\") " pod="openstack/ceilometer-0" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.189648 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbxhx\" (UniqueName: \"kubernetes.io/projected/8506c575-4d5d-4691-9ebd-75d8c878f9a9-kube-api-access-bbxhx\") pod \"ceilometer-0\" (UID: \"8506c575-4d5d-4691-9ebd-75d8c878f9a9\") " pod="openstack/ceilometer-0" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.189723 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8506c575-4d5d-4691-9ebd-75d8c878f9a9-log-httpd\") pod \"ceilometer-0\" (UID: \"8506c575-4d5d-4691-9ebd-75d8c878f9a9\") " pod="openstack/ceilometer-0" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.189739 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8506c575-4d5d-4691-9ebd-75d8c878f9a9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8506c575-4d5d-4691-9ebd-75d8c878f9a9\") " pod="openstack/ceilometer-0" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.189756 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8506c575-4d5d-4691-9ebd-75d8c878f9a9-run-httpd\") pod \"ceilometer-0\" (UID: \"8506c575-4d5d-4691-9ebd-75d8c878f9a9\") " pod="openstack/ceilometer-0" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.189797 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8506c575-4d5d-4691-9ebd-75d8c878f9a9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8506c575-4d5d-4691-9ebd-75d8c878f9a9\") " pod="openstack/ceilometer-0" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.189827 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8506c575-4d5d-4691-9ebd-75d8c878f9a9-scripts\") pod \"ceilometer-0\" (UID: \"8506c575-4d5d-4691-9ebd-75d8c878f9a9\") " pod="openstack/ceilometer-0" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.191039 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8506c575-4d5d-4691-9ebd-75d8c878f9a9-log-httpd\") pod \"ceilometer-0\" (UID: \"8506c575-4d5d-4691-9ebd-75d8c878f9a9\") " 
pod="openstack/ceilometer-0" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.191152 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8506c575-4d5d-4691-9ebd-75d8c878f9a9-run-httpd\") pod \"ceilometer-0\" (UID: \"8506c575-4d5d-4691-9ebd-75d8c878f9a9\") " pod="openstack/ceilometer-0" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.194218 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8506c575-4d5d-4691-9ebd-75d8c878f9a9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8506c575-4d5d-4691-9ebd-75d8c878f9a9\") " pod="openstack/ceilometer-0" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.194445 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8506c575-4d5d-4691-9ebd-75d8c878f9a9-scripts\") pod \"ceilometer-0\" (UID: \"8506c575-4d5d-4691-9ebd-75d8c878f9a9\") " pod="openstack/ceilometer-0" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.195056 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8506c575-4d5d-4691-9ebd-75d8c878f9a9-config-data\") pod \"ceilometer-0\" (UID: \"8506c575-4d5d-4691-9ebd-75d8c878f9a9\") " pod="openstack/ceilometer-0" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.195104 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8506c575-4d5d-4691-9ebd-75d8c878f9a9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8506c575-4d5d-4691-9ebd-75d8c878f9a9\") " pod="openstack/ceilometer-0" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.206215 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbxhx\" (UniqueName: 
\"kubernetes.io/projected/8506c575-4d5d-4691-9ebd-75d8c878f9a9-kube-api-access-bbxhx\") pod \"ceilometer-0\" (UID: \"8506c575-4d5d-4691-9ebd-75d8c878f9a9\") " pod="openstack/ceilometer-0" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.405352 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.694109 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-798b4f9b87-frx5r"] Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.695942 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-798b4f9b87-frx5r" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.697575 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d235163d-548f-40e3-9aae-490a41523da2-ovndb-tls-certs\") pod \"neutron-798b4f9b87-frx5r\" (UID: \"d235163d-548f-40e3-9aae-490a41523da2\") " pod="openstack/neutron-798b4f9b87-frx5r" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.697625 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d235163d-548f-40e3-9aae-490a41523da2-config\") pod \"neutron-798b4f9b87-frx5r\" (UID: \"d235163d-548f-40e3-9aae-490a41523da2\") " pod="openstack/neutron-798b4f9b87-frx5r" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.697680 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d235163d-548f-40e3-9aae-490a41523da2-httpd-config\") pod \"neutron-798b4f9b87-frx5r\" (UID: \"d235163d-548f-40e3-9aae-490a41523da2\") " pod="openstack/neutron-798b4f9b87-frx5r" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.697717 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgjg8\" (UniqueName: \"kubernetes.io/projected/d235163d-548f-40e3-9aae-490a41523da2-kube-api-access-cgjg8\") pod \"neutron-798b4f9b87-frx5r\" (UID: \"d235163d-548f-40e3-9aae-490a41523da2\") " pod="openstack/neutron-798b4f9b87-frx5r" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.697761 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d235163d-548f-40e3-9aae-490a41523da2-public-tls-certs\") pod \"neutron-798b4f9b87-frx5r\" (UID: \"d235163d-548f-40e3-9aae-490a41523da2\") " pod="openstack/neutron-798b4f9b87-frx5r" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.697780 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d235163d-548f-40e3-9aae-490a41523da2-combined-ca-bundle\") pod \"neutron-798b4f9b87-frx5r\" (UID: \"d235163d-548f-40e3-9aae-490a41523da2\") " pod="openstack/neutron-798b4f9b87-frx5r" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.697840 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d235163d-548f-40e3-9aae-490a41523da2-internal-tls-certs\") pod \"neutron-798b4f9b87-frx5r\" (UID: \"d235163d-548f-40e3-9aae-490a41523da2\") " pod="openstack/neutron-798b4f9b87-frx5r" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.702193 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.705133 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.712306 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-798b4f9b87-frx5r"] Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.798861 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d235163d-548f-40e3-9aae-490a41523da2-httpd-config\") pod \"neutron-798b4f9b87-frx5r\" (UID: \"d235163d-548f-40e3-9aae-490a41523da2\") " pod="openstack/neutron-798b4f9b87-frx5r" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.798932 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgjg8\" (UniqueName: \"kubernetes.io/projected/d235163d-548f-40e3-9aae-490a41523da2-kube-api-access-cgjg8\") pod \"neutron-798b4f9b87-frx5r\" (UID: \"d235163d-548f-40e3-9aae-490a41523da2\") " pod="openstack/neutron-798b4f9b87-frx5r" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.798967 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d235163d-548f-40e3-9aae-490a41523da2-public-tls-certs\") pod \"neutron-798b4f9b87-frx5r\" (UID: \"d235163d-548f-40e3-9aae-490a41523da2\") " pod="openstack/neutron-798b4f9b87-frx5r" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.798986 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d235163d-548f-40e3-9aae-490a41523da2-combined-ca-bundle\") pod \"neutron-798b4f9b87-frx5r\" (UID: \"d235163d-548f-40e3-9aae-490a41523da2\") " pod="openstack/neutron-798b4f9b87-frx5r" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.799038 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d235163d-548f-40e3-9aae-490a41523da2-internal-tls-certs\") pod \"neutron-798b4f9b87-frx5r\" (UID: \"d235163d-548f-40e3-9aae-490a41523da2\") " pod="openstack/neutron-798b4f9b87-frx5r" Oct 01 10:33:47 crc 
kubenswrapper[4735]: I1001 10:33:47.799082 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d235163d-548f-40e3-9aae-490a41523da2-ovndb-tls-certs\") pod \"neutron-798b4f9b87-frx5r\" (UID: \"d235163d-548f-40e3-9aae-490a41523da2\") " pod="openstack/neutron-798b4f9b87-frx5r" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.799103 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d235163d-548f-40e3-9aae-490a41523da2-config\") pod \"neutron-798b4f9b87-frx5r\" (UID: \"d235163d-548f-40e3-9aae-490a41523da2\") " pod="openstack/neutron-798b4f9b87-frx5r" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.806810 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d235163d-548f-40e3-9aae-490a41523da2-httpd-config\") pod \"neutron-798b4f9b87-frx5r\" (UID: \"d235163d-548f-40e3-9aae-490a41523da2\") " pod="openstack/neutron-798b4f9b87-frx5r" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.806854 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d235163d-548f-40e3-9aae-490a41523da2-public-tls-certs\") pod \"neutron-798b4f9b87-frx5r\" (UID: \"d235163d-548f-40e3-9aae-490a41523da2\") " pod="openstack/neutron-798b4f9b87-frx5r" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.807296 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d235163d-548f-40e3-9aae-490a41523da2-internal-tls-certs\") pod \"neutron-798b4f9b87-frx5r\" (UID: \"d235163d-548f-40e3-9aae-490a41523da2\") " pod="openstack/neutron-798b4f9b87-frx5r" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.807953 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/d235163d-548f-40e3-9aae-490a41523da2-config\") pod \"neutron-798b4f9b87-frx5r\" (UID: \"d235163d-548f-40e3-9aae-490a41523da2\") " pod="openstack/neutron-798b4f9b87-frx5r" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.808506 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d235163d-548f-40e3-9aae-490a41523da2-combined-ca-bundle\") pod \"neutron-798b4f9b87-frx5r\" (UID: \"d235163d-548f-40e3-9aae-490a41523da2\") " pod="openstack/neutron-798b4f9b87-frx5r" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.812242 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d235163d-548f-40e3-9aae-490a41523da2-ovndb-tls-certs\") pod \"neutron-798b4f9b87-frx5r\" (UID: \"d235163d-548f-40e3-9aae-490a41523da2\") " pod="openstack/neutron-798b4f9b87-frx5r" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.825536 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgjg8\" (UniqueName: \"kubernetes.io/projected/d235163d-548f-40e3-9aae-490a41523da2-kube-api-access-cgjg8\") pod \"neutron-798b4f9b87-frx5r\" (UID: \"d235163d-548f-40e3-9aae-490a41523da2\") " pod="openstack/neutron-798b4f9b87-frx5r" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.909658 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ea41140-1a15-4e64-89e5-d68b9208dff1" path="/var/lib/kubelet/pods/7ea41140-1a15-4e64-89e5-d68b9208dff1/volumes" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.912782 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.926762 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-g2kcn" 
event={"ID":"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e","Type":"ContainerStarted","Data":"d59a3e86a73d107d2ef3e7aecd07aff4439ab508b907181c3bda693c666be0e3"} Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.926906 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84b966f6c9-g2kcn" Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.928961 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8506c575-4d5d-4691-9ebd-75d8c878f9a9","Type":"ContainerStarted","Data":"b9326a7ea2df5acfc940acd75d3fc2ee9b826a751cb8115960bd3f4a6de4b62b"} Oct 01 10:33:47 crc kubenswrapper[4735]: I1001 10:33:47.963862 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84b966f6c9-g2kcn" podStartSLOduration=2.963842242 podStartE2EDuration="2.963842242s" podCreationTimestamp="2025-10-01 10:33:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:33:47.954308497 +0000 UTC m=+986.647129769" watchObservedRunningTime="2025-10-01 10:33:47.963842242 +0000 UTC m=+986.656663504" Oct 01 10:33:48 crc kubenswrapper[4735]: I1001 10:33:48.029945 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-798b4f9b87-frx5r" Oct 01 10:33:48 crc kubenswrapper[4735]: I1001 10:33:48.561544 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-798b4f9b87-frx5r"] Oct 01 10:33:48 crc kubenswrapper[4735]: W1001 10:33:48.567029 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd235163d_548f_40e3_9aae_490a41523da2.slice/crio-40910ceaff3e598406e688ebcd6237157c8004179477233d6b08b0349bfa5620 WatchSource:0}: Error finding container 40910ceaff3e598406e688ebcd6237157c8004179477233d6b08b0349bfa5620: Status 404 returned error can't find the container with id 40910ceaff3e598406e688ebcd6237157c8004179477233d6b08b0349bfa5620 Oct 01 10:33:48 crc kubenswrapper[4735]: I1001 10:33:48.945444 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8506c575-4d5d-4691-9ebd-75d8c878f9a9","Type":"ContainerStarted","Data":"f2fd5fba0b761cf62bfc989e53081dd9793c5448aae946a24d57446b23d5f2fb"} Oct 01 10:33:48 crc kubenswrapper[4735]: I1001 10:33:48.950560 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-798b4f9b87-frx5r" event={"ID":"d235163d-548f-40e3-9aae-490a41523da2","Type":"ContainerStarted","Data":"5be5f81ec4ae755f3e1e12274b599f320b5f3dfcec80ca5d223dcd7d1490bc7e"} Oct 01 10:33:48 crc kubenswrapper[4735]: I1001 10:33:48.950593 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-798b4f9b87-frx5r" event={"ID":"d235163d-548f-40e3-9aae-490a41523da2","Type":"ContainerStarted","Data":"40910ceaff3e598406e688ebcd6237157c8004179477233d6b08b0349bfa5620"} Oct 01 10:33:49 crc kubenswrapper[4735]: I1001 10:33:49.959831 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8506c575-4d5d-4691-9ebd-75d8c878f9a9","Type":"ContainerStarted","Data":"f16d05092237210f009ea9104512444793fc9c41bc32a45a487845d6d6f77605"} Oct 01 
10:33:49 crc kubenswrapper[4735]: I1001 10:33:49.964234 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-798b4f9b87-frx5r" event={"ID":"d235163d-548f-40e3-9aae-490a41523da2","Type":"ContainerStarted","Data":"8538876be6a130589d667378fe45e33c33c93aeab1a2b60b2a99d6856af09378"} Oct 01 10:33:49 crc kubenswrapper[4735]: I1001 10:33:49.964524 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-798b4f9b87-frx5r" Oct 01 10:33:49 crc kubenswrapper[4735]: I1001 10:33:49.985845 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-798b4f9b87-frx5r" podStartSLOduration=2.985827716 podStartE2EDuration="2.985827716s" podCreationTimestamp="2025-10-01 10:33:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:33:49.984653305 +0000 UTC m=+988.677474567" watchObservedRunningTime="2025-10-01 10:33:49.985827716 +0000 UTC m=+988.678648978" Oct 01 10:33:50 crc kubenswrapper[4735]: I1001 10:33:50.974989 4735 generic.go:334] "Generic (PLEG): container finished" podID="45ed423f-4895-4df3-9a04-2b916f38f57d" containerID="7a05d0208176e16ca827de0751205940fe05ef3bd18c2ddf1872862f95be80ef" exitCode=0 Oct 01 10:33:50 crc kubenswrapper[4735]: I1001 10:33:50.975060 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bcff764fb-c7nmm" event={"ID":"45ed423f-4895-4df3-9a04-2b916f38f57d","Type":"ContainerDied","Data":"7a05d0208176e16ca827de0751205940fe05ef3bd18c2ddf1872862f95be80ef"} Oct 01 10:33:52 crc kubenswrapper[4735]: I1001 10:33:52.546723 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7bcff764fb-c7nmm" podUID="45ed423f-4895-4df3-9a04-2b916f38f57d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" 
Oct 01 10:33:53 crc kubenswrapper[4735]: I1001 10:33:53.003082 4735 generic.go:334] "Generic (PLEG): container finished" podID="dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9" containerID="f45297ae1821283c10e00cef6bdb2567eab5061fda027dbd18717bb611d28ac6" exitCode=137 Oct 01 10:33:53 crc kubenswrapper[4735]: I1001 10:33:53.003371 4735 generic.go:334] "Generic (PLEG): container finished" podID="dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9" containerID="2f41577d47c185ffd16c8184feeda46c83f713c61adeb5be19109cfa62eb629a" exitCode=137 Oct 01 10:33:53 crc kubenswrapper[4735]: I1001 10:33:53.003382 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c88c854b9-2q2b2" event={"ID":"dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9","Type":"ContainerDied","Data":"f45297ae1821283c10e00cef6bdb2567eab5061fda027dbd18717bb611d28ac6"} Oct 01 10:33:53 crc kubenswrapper[4735]: I1001 10:33:53.003577 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c88c854b9-2q2b2" event={"ID":"dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9","Type":"ContainerDied","Data":"2f41577d47c185ffd16c8184feeda46c83f713c61adeb5be19109cfa62eb629a"} Oct 01 10:33:53 crc kubenswrapper[4735]: I1001 10:33:53.005989 4735 generic.go:334] "Generic (PLEG): container finished" podID="a580b903-ba8b-44cb-bff3-8ff737dce411" containerID="28770dfd9a0f891204b7de9f740e8148c5335e16103b1203773b90f797591286" exitCode=137 Oct 01 10:33:53 crc kubenswrapper[4735]: I1001 10:33:53.006021 4735 generic.go:334] "Generic (PLEG): container finished" podID="a580b903-ba8b-44cb-bff3-8ff737dce411" containerID="3eede71fcdb5b04f262f296e20150d0aebde6252b04fd29c856e458faf3b4d1a" exitCode=137 Oct 01 10:33:53 crc kubenswrapper[4735]: I1001 10:33:53.006083 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-687c8c8cfc-p8gqg" event={"ID":"a580b903-ba8b-44cb-bff3-8ff737dce411","Type":"ContainerDied","Data":"28770dfd9a0f891204b7de9f740e8148c5335e16103b1203773b90f797591286"} Oct 01 10:33:53 crc kubenswrapper[4735]: 
I1001 10:33:53.006297 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-687c8c8cfc-p8gqg" event={"ID":"a580b903-ba8b-44cb-bff3-8ff737dce411","Type":"ContainerDied","Data":"3eede71fcdb5b04f262f296e20150d0aebde6252b04fd29c856e458faf3b4d1a"} Oct 01 10:33:53 crc kubenswrapper[4735]: I1001 10:33:53.007968 4735 generic.go:334] "Generic (PLEG): container finished" podID="db7fa1b4-c7fb-4b97-85be-4fec151375fa" containerID="4f6d808868f636be2719fc1bec51ec6244cd5da7cb855be99c24add393175bf4" exitCode=137 Oct 01 10:33:53 crc kubenswrapper[4735]: I1001 10:33:53.008007 4735 generic.go:334] "Generic (PLEG): container finished" podID="db7fa1b4-c7fb-4b97-85be-4fec151375fa" containerID="dc1f2ab489b9c41106f18065904e23f2a61fe118fb0a8eb2a0fd916439f99627" exitCode=137 Oct 01 10:33:53 crc kubenswrapper[4735]: I1001 10:33:53.008038 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-cd5d6d777-529hb" event={"ID":"db7fa1b4-c7fb-4b97-85be-4fec151375fa","Type":"ContainerDied","Data":"4f6d808868f636be2719fc1bec51ec6244cd5da7cb855be99c24add393175bf4"} Oct 01 10:33:53 crc kubenswrapper[4735]: I1001 10:33:53.008072 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-cd5d6d777-529hb" event={"ID":"db7fa1b4-c7fb-4b97-85be-4fec151375fa","Type":"ContainerDied","Data":"dc1f2ab489b9c41106f18065904e23f2a61fe118fb0a8eb2a0fd916439f99627"} Oct 01 10:33:55 crc kubenswrapper[4735]: I1001 10:33:55.450814 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84b966f6c9-g2kcn" Oct 01 10:33:55 crc kubenswrapper[4735]: I1001 10:33:55.532275 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-xjfvr"] Oct 01 10:33:55 crc kubenswrapper[4735]: I1001 10:33:55.532595 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8b5c85b87-xjfvr" podUID="07675ec8-d5bc-450f-a1af-dd92e82f7696" containerName="dnsmasq-dns" 
containerID="cri-o://fbc1d1928b1bed5b4a3c7fabed3cd6149ead79c3cccff96b2415ff3f20fd203c" gracePeriod=10 Oct 01 10:33:55 crc kubenswrapper[4735]: I1001 10:33:55.974271 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-cd5d6d777-529hb" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.005832 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-687c8c8cfc-p8gqg" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.018516 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c88c854b9-2q2b2" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.046273 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-687c8c8cfc-p8gqg" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.046289 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-687c8c8cfc-p8gqg" event={"ID":"a580b903-ba8b-44cb-bff3-8ff737dce411","Type":"ContainerDied","Data":"66f39faa86d65dbaf4bb9a88b859ca67b4a01f82d67c07474667d7c54af8cbf7"} Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.046765 4735 scope.go:117] "RemoveContainer" containerID="28770dfd9a0f891204b7de9f740e8148c5335e16103b1203773b90f797591286" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.061801 4735 generic.go:334] "Generic (PLEG): container finished" podID="07675ec8-d5bc-450f-a1af-dd92e82f7696" containerID="fbc1d1928b1bed5b4a3c7fabed3cd6149ead79c3cccff96b2415ff3f20fd203c" exitCode=0 Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.065909 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-xjfvr" event={"ID":"07675ec8-d5bc-450f-a1af-dd92e82f7696","Type":"ContainerDied","Data":"fbc1d1928b1bed5b4a3c7fabed3cd6149ead79c3cccff96b2415ff3f20fd203c"} Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.080696 4735 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/horizon-cd5d6d777-529hb" event={"ID":"db7fa1b4-c7fb-4b97-85be-4fec151375fa","Type":"ContainerDied","Data":"8ee347a787b48b4ce4b7a1980acfd67a24c0b120e1bdc0c840b6dd75db444a69"} Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.080796 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-cd5d6d777-529hb" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.084394 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c88c854b9-2q2b2" event={"ID":"dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9","Type":"ContainerDied","Data":"b0730ec3f02e35390c651ddf406a8bdfec64dafc6a173a790f4862099f0712dc"} Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.084442 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c88c854b9-2q2b2" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.166872 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/db7fa1b4-c7fb-4b97-85be-4fec151375fa-scripts\") pod \"db7fa1b4-c7fb-4b97-85be-4fec151375fa\" (UID: \"db7fa1b4-c7fb-4b97-85be-4fec151375fa\") " Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.166945 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xt2n\" (UniqueName: \"kubernetes.io/projected/a580b903-ba8b-44cb-bff3-8ff737dce411-kube-api-access-2xt2n\") pod \"a580b903-ba8b-44cb-bff3-8ff737dce411\" (UID: \"a580b903-ba8b-44cb-bff3-8ff737dce411\") " Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.166988 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/db7fa1b4-c7fb-4b97-85be-4fec151375fa-horizon-secret-key\") pod \"db7fa1b4-c7fb-4b97-85be-4fec151375fa\" (UID: \"db7fa1b4-c7fb-4b97-85be-4fec151375fa\") " Oct 01 10:33:56 crc kubenswrapper[4735]: 
I1001 10:33:56.167014 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a580b903-ba8b-44cb-bff3-8ff737dce411-horizon-secret-key\") pod \"a580b903-ba8b-44cb-bff3-8ff737dce411\" (UID: \"a580b903-ba8b-44cb-bff3-8ff737dce411\") " Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.167042 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a580b903-ba8b-44cb-bff3-8ff737dce411-scripts\") pod \"a580b903-ba8b-44cb-bff3-8ff737dce411\" (UID: \"a580b903-ba8b-44cb-bff3-8ff737dce411\") " Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.167086 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr62m\" (UniqueName: \"kubernetes.io/projected/dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9-kube-api-access-vr62m\") pod \"dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9\" (UID: \"dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9\") " Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.167130 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/db7fa1b4-c7fb-4b97-85be-4fec151375fa-config-data\") pod \"db7fa1b4-c7fb-4b97-85be-4fec151375fa\" (UID: \"db7fa1b4-c7fb-4b97-85be-4fec151375fa\") " Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.167155 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9-horizon-secret-key\") pod \"dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9\" (UID: \"dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9\") " Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.167185 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a580b903-ba8b-44cb-bff3-8ff737dce411-logs\") pod 
\"a580b903-ba8b-44cb-bff3-8ff737dce411\" (UID: \"a580b903-ba8b-44cb-bff3-8ff737dce411\") " Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.167207 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9-config-data\") pod \"dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9\" (UID: \"dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9\") " Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.167246 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db7fa1b4-c7fb-4b97-85be-4fec151375fa-logs\") pod \"db7fa1b4-c7fb-4b97-85be-4fec151375fa\" (UID: \"db7fa1b4-c7fb-4b97-85be-4fec151375fa\") " Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.167283 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4479x\" (UniqueName: \"kubernetes.io/projected/db7fa1b4-c7fb-4b97-85be-4fec151375fa-kube-api-access-4479x\") pod \"db7fa1b4-c7fb-4b97-85be-4fec151375fa\" (UID: \"db7fa1b4-c7fb-4b97-85be-4fec151375fa\") " Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.167328 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9-logs\") pod \"dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9\" (UID: \"dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9\") " Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.167354 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9-scripts\") pod \"dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9\" (UID: \"dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9\") " Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.167383 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/a580b903-ba8b-44cb-bff3-8ff737dce411-config-data\") pod \"a580b903-ba8b-44cb-bff3-8ff737dce411\" (UID: \"a580b903-ba8b-44cb-bff3-8ff737dce411\") " Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.167995 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a580b903-ba8b-44cb-bff3-8ff737dce411-logs" (OuterVolumeSpecName: "logs") pod "a580b903-ba8b-44cb-bff3-8ff737dce411" (UID: "a580b903-ba8b-44cb-bff3-8ff737dce411"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.168631 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9-logs" (OuterVolumeSpecName: "logs") pod "dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9" (UID: "dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.168634 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db7fa1b4-c7fb-4b97-85be-4fec151375fa-logs" (OuterVolumeSpecName: "logs") pod "db7fa1b4-c7fb-4b97-85be-4fec151375fa" (UID: "db7fa1b4-c7fb-4b97-85be-4fec151375fa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.174262 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9-kube-api-access-vr62m" (OuterVolumeSpecName: "kube-api-access-vr62m") pod "dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9" (UID: "dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9"). InnerVolumeSpecName "kube-api-access-vr62m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.176239 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9" (UID: "dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.177645 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a580b903-ba8b-44cb-bff3-8ff737dce411-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a580b903-ba8b-44cb-bff3-8ff737dce411" (UID: "a580b903-ba8b-44cb-bff3-8ff737dce411"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.179986 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db7fa1b4-c7fb-4b97-85be-4fec151375fa-kube-api-access-4479x" (OuterVolumeSpecName: "kube-api-access-4479x") pod "db7fa1b4-c7fb-4b97-85be-4fec151375fa" (UID: "db7fa1b4-c7fb-4b97-85be-4fec151375fa"). InnerVolumeSpecName "kube-api-access-4479x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.184795 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a580b903-ba8b-44cb-bff3-8ff737dce411-kube-api-access-2xt2n" (OuterVolumeSpecName: "kube-api-access-2xt2n") pod "a580b903-ba8b-44cb-bff3-8ff737dce411" (UID: "a580b903-ba8b-44cb-bff3-8ff737dce411"). InnerVolumeSpecName "kube-api-access-2xt2n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.198634 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db7fa1b4-c7fb-4b97-85be-4fec151375fa-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "db7fa1b4-c7fb-4b97-85be-4fec151375fa" (UID: "db7fa1b4-c7fb-4b97-85be-4fec151375fa"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.208930 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db7fa1b4-c7fb-4b97-85be-4fec151375fa-config-data" (OuterVolumeSpecName: "config-data") pod "db7fa1b4-c7fb-4b97-85be-4fec151375fa" (UID: "db7fa1b4-c7fb-4b97-85be-4fec151375fa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.209787 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db7fa1b4-c7fb-4b97-85be-4fec151375fa-scripts" (OuterVolumeSpecName: "scripts") pod "db7fa1b4-c7fb-4b97-85be-4fec151375fa" (UID: "db7fa1b4-c7fb-4b97-85be-4fec151375fa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.210462 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a580b903-ba8b-44cb-bff3-8ff737dce411-scripts" (OuterVolumeSpecName: "scripts") pod "a580b903-ba8b-44cb-bff3-8ff737dce411" (UID: "a580b903-ba8b-44cb-bff3-8ff737dce411"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.214713 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9-scripts" (OuterVolumeSpecName: "scripts") pod "dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9" (UID: "dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.216312 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a580b903-ba8b-44cb-bff3-8ff737dce411-config-data" (OuterVolumeSpecName: "config-data") pod "a580b903-ba8b-44cb-bff3-8ff737dce411" (UID: "a580b903-ba8b-44cb-bff3-8ff737dce411"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.235453 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9-config-data" (OuterVolumeSpecName: "config-data") pod "dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9" (UID: "dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.272552 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/db7fa1b4-c7fb-4b97-85be-4fec151375fa-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.272602 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xt2n\" (UniqueName: \"kubernetes.io/projected/a580b903-ba8b-44cb-bff3-8ff737dce411-kube-api-access-2xt2n\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.272616 4735 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/db7fa1b4-c7fb-4b97-85be-4fec151375fa-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.272625 4735 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a580b903-ba8b-44cb-bff3-8ff737dce411-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.272635 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a580b903-ba8b-44cb-bff3-8ff737dce411-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.272645 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vr62m\" (UniqueName: \"kubernetes.io/projected/dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9-kube-api-access-vr62m\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.272654 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/db7fa1b4-c7fb-4b97-85be-4fec151375fa-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.272663 
4735 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.272671 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a580b903-ba8b-44cb-bff3-8ff737dce411-logs\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.272678 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.272686 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db7fa1b4-c7fb-4b97-85be-4fec151375fa-logs\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.272694 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4479x\" (UniqueName: \"kubernetes.io/projected/db7fa1b4-c7fb-4b97-85be-4fec151375fa-kube-api-access-4479x\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.272701 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9-logs\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.272710 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.272719 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a580b903-ba8b-44cb-bff3-8ff737dce411-config-data\") on node \"crc\" 
DevicePath \"\"" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.288741 4735 scope.go:117] "RemoveContainer" containerID="3eede71fcdb5b04f262f296e20150d0aebde6252b04fd29c856e458faf3b4d1a" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.332464 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-xjfvr" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.341315 4735 scope.go:117] "RemoveContainer" containerID="4f6d808868f636be2719fc1bec51ec6244cd5da7cb855be99c24add393175bf4" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.385128 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-687c8c8cfc-p8gqg"] Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.391587 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-687c8c8cfc-p8gqg"] Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.419430 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c88c854b9-2q2b2"] Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.429736 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7c88c854b9-2q2b2"] Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.434568 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-cd5d6d777-529hb"] Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.441195 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-cd5d6d777-529hb"] Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.476271 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07675ec8-d5bc-450f-a1af-dd92e82f7696-ovsdbserver-sb\") pod \"07675ec8-d5bc-450f-a1af-dd92e82f7696\" (UID: \"07675ec8-d5bc-450f-a1af-dd92e82f7696\") " Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.476366 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07675ec8-d5bc-450f-a1af-dd92e82f7696-dns-svc\") pod \"07675ec8-d5bc-450f-a1af-dd92e82f7696\" (UID: \"07675ec8-d5bc-450f-a1af-dd92e82f7696\") " Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.476391 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07675ec8-d5bc-450f-a1af-dd92e82f7696-config\") pod \"07675ec8-d5bc-450f-a1af-dd92e82f7696\" (UID: \"07675ec8-d5bc-450f-a1af-dd92e82f7696\") " Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.476414 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07675ec8-d5bc-450f-a1af-dd92e82f7696-ovsdbserver-nb\") pod \"07675ec8-d5bc-450f-a1af-dd92e82f7696\" (UID: \"07675ec8-d5bc-450f-a1af-dd92e82f7696\") " Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.476459 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07675ec8-d5bc-450f-a1af-dd92e82f7696-dns-swift-storage-0\") pod \"07675ec8-d5bc-450f-a1af-dd92e82f7696\" (UID: \"07675ec8-d5bc-450f-a1af-dd92e82f7696\") " Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.476477 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxwzv\" (UniqueName: \"kubernetes.io/projected/07675ec8-d5bc-450f-a1af-dd92e82f7696-kube-api-access-nxwzv\") pod \"07675ec8-d5bc-450f-a1af-dd92e82f7696\" (UID: \"07675ec8-d5bc-450f-a1af-dd92e82f7696\") " Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.480559 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07675ec8-d5bc-450f-a1af-dd92e82f7696-kube-api-access-nxwzv" (OuterVolumeSpecName: "kube-api-access-nxwzv") pod "07675ec8-d5bc-450f-a1af-dd92e82f7696" (UID: 
"07675ec8-d5bc-450f-a1af-dd92e82f7696"). InnerVolumeSpecName "kube-api-access-nxwzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.521651 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07675ec8-d5bc-450f-a1af-dd92e82f7696-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "07675ec8-d5bc-450f-a1af-dd92e82f7696" (UID: "07675ec8-d5bc-450f-a1af-dd92e82f7696"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.522134 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07675ec8-d5bc-450f-a1af-dd92e82f7696-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "07675ec8-d5bc-450f-a1af-dd92e82f7696" (UID: "07675ec8-d5bc-450f-a1af-dd92e82f7696"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.523074 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07675ec8-d5bc-450f-a1af-dd92e82f7696-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "07675ec8-d5bc-450f-a1af-dd92e82f7696" (UID: "07675ec8-d5bc-450f-a1af-dd92e82f7696"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.524457 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07675ec8-d5bc-450f-a1af-dd92e82f7696-config" (OuterVolumeSpecName: "config") pod "07675ec8-d5bc-450f-a1af-dd92e82f7696" (UID: "07675ec8-d5bc-450f-a1af-dd92e82f7696"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.527464 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07675ec8-d5bc-450f-a1af-dd92e82f7696-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "07675ec8-d5bc-450f-a1af-dd92e82f7696" (UID: "07675ec8-d5bc-450f-a1af-dd92e82f7696"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.551269 4735 scope.go:117] "RemoveContainer" containerID="dc1f2ab489b9c41106f18065904e23f2a61fe118fb0a8eb2a0fd916439f99627" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.575321 4735 scope.go:117] "RemoveContainer" containerID="f45297ae1821283c10e00cef6bdb2567eab5061fda027dbd18717bb611d28ac6" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.578733 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07675ec8-d5bc-450f-a1af-dd92e82f7696-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.578751 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07675ec8-d5bc-450f-a1af-dd92e82f7696-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.578776 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07675ec8-d5bc-450f-a1af-dd92e82f7696-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.578784 4735 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07675ec8-d5bc-450f-a1af-dd92e82f7696-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.578794 4735 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxwzv\" (UniqueName: \"kubernetes.io/projected/07675ec8-d5bc-450f-a1af-dd92e82f7696-kube-api-access-nxwzv\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.578802 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07675ec8-d5bc-450f-a1af-dd92e82f7696-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 10:33:56 crc kubenswrapper[4735]: I1001 10:33:56.745789 4735 scope.go:117] "RemoveContainer" containerID="2f41577d47c185ffd16c8184feeda46c83f713c61adeb5be19109cfa62eb629a" Oct 01 10:33:57 crc kubenswrapper[4735]: I1001 10:33:57.094598 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8506c575-4d5d-4691-9ebd-75d8c878f9a9","Type":"ContainerStarted","Data":"77bae45e0a3516c6625bd67820d03f8c1de32b8759457e0fb94b7e4e62b09187"} Oct 01 10:33:57 crc kubenswrapper[4735]: I1001 10:33:57.098111 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-xjfvr" event={"ID":"07675ec8-d5bc-450f-a1af-dd92e82f7696","Type":"ContainerDied","Data":"041f729007a1a297ac8706f5669857264b9163b5b802b96e7962cb369573a589"} Oct 01 10:33:57 crc kubenswrapper[4735]: I1001 10:33:57.098141 4735 scope.go:117] "RemoveContainer" containerID="fbc1d1928b1bed5b4a3c7fabed3cd6149ead79c3cccff96b2415ff3f20fd203c" Oct 01 10:33:57 crc kubenswrapper[4735]: I1001 10:33:57.098232 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-xjfvr" Oct 01 10:33:57 crc kubenswrapper[4735]: I1001 10:33:57.139426 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-xjfvr"] Oct 01 10:33:57 crc kubenswrapper[4735]: I1001 10:33:57.145216 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-xjfvr"] Oct 01 10:33:57 crc kubenswrapper[4735]: I1001 10:33:57.909011 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07675ec8-d5bc-450f-a1af-dd92e82f7696" path="/var/lib/kubelet/pods/07675ec8-d5bc-450f-a1af-dd92e82f7696/volumes" Oct 01 10:33:57 crc kubenswrapper[4735]: I1001 10:33:57.910347 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a580b903-ba8b-44cb-bff3-8ff737dce411" path="/var/lib/kubelet/pods/a580b903-ba8b-44cb-bff3-8ff737dce411/volumes" Oct 01 10:33:57 crc kubenswrapper[4735]: I1001 10:33:57.911925 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db7fa1b4-c7fb-4b97-85be-4fec151375fa" path="/var/lib/kubelet/pods/db7fa1b4-c7fb-4b97-85be-4fec151375fa/volumes" Oct 01 10:33:57 crc kubenswrapper[4735]: I1001 10:33:57.914375 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9" path="/var/lib/kubelet/pods/dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9/volumes" Oct 01 10:33:59 crc kubenswrapper[4735]: I1001 10:33:59.590340 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7888d7549b-dbpkr" Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.323774 4735 scope.go:117] "RemoveContainer" containerID="da4392475356db13bbea088b279054b8073d3fda2f54cc090029c533150ff6df" Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.366891 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 01 10:34:01 crc kubenswrapper[4735]: E1001 10:34:01.367614 4735 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9" containerName="horizon-log" Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.367630 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9" containerName="horizon-log" Oct 01 10:34:01 crc kubenswrapper[4735]: E1001 10:34:01.367639 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9" containerName="horizon" Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.367645 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9" containerName="horizon" Oct 01 10:34:01 crc kubenswrapper[4735]: E1001 10:34:01.367660 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a580b903-ba8b-44cb-bff3-8ff737dce411" containerName="horizon-log" Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.367666 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a580b903-ba8b-44cb-bff3-8ff737dce411" containerName="horizon-log" Oct 01 10:34:01 crc kubenswrapper[4735]: E1001 10:34:01.367683 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07675ec8-d5bc-450f-a1af-dd92e82f7696" containerName="dnsmasq-dns" Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.367691 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="07675ec8-d5bc-450f-a1af-dd92e82f7696" containerName="dnsmasq-dns" Oct 01 10:34:01 crc kubenswrapper[4735]: E1001 10:34:01.367701 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db7fa1b4-c7fb-4b97-85be-4fec151375fa" containerName="horizon-log" Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.367708 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="db7fa1b4-c7fb-4b97-85be-4fec151375fa" containerName="horizon-log" Oct 01 10:34:01 crc kubenswrapper[4735]: E1001 10:34:01.367731 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a580b903-ba8b-44cb-bff3-8ff737dce411" containerName="horizon" Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.367737 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a580b903-ba8b-44cb-bff3-8ff737dce411" containerName="horizon" Oct 01 10:34:01 crc kubenswrapper[4735]: E1001 10:34:01.367753 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07675ec8-d5bc-450f-a1af-dd92e82f7696" containerName="init" Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.367759 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="07675ec8-d5bc-450f-a1af-dd92e82f7696" containerName="init" Oct 01 10:34:01 crc kubenswrapper[4735]: E1001 10:34:01.367769 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db7fa1b4-c7fb-4b97-85be-4fec151375fa" containerName="horizon" Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.367775 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="db7fa1b4-c7fb-4b97-85be-4fec151375fa" containerName="horizon" Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.367939 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9" containerName="horizon-log" Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.367957 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a580b903-ba8b-44cb-bff3-8ff737dce411" containerName="horizon-log" Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.367967 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcf01b4b-4dd5-49c3-bc77-b8bc0b9b0fc9" containerName="horizon" Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.367977 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="07675ec8-d5bc-450f-a1af-dd92e82f7696" containerName="dnsmasq-dns" Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.367986 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="db7fa1b4-c7fb-4b97-85be-4fec151375fa" containerName="horizon" 
Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.368000 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a580b903-ba8b-44cb-bff3-8ff737dce411" containerName="horizon" Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.368007 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="db7fa1b4-c7fb-4b97-85be-4fec151375fa" containerName="horizon-log" Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.368585 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.371069 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-mrspx" Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.371282 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.371427 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.396304 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.560912 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c4b138e5-c197-4f7d-b0c3-aed5f8dd07e5-openstack-config-secret\") pod \"openstackclient\" (UID: \"c4b138e5-c197-4f7d-b0c3-aed5f8dd07e5\") " pod="openstack/openstackclient" Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.561001 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28dcc\" (UniqueName: \"kubernetes.io/projected/c4b138e5-c197-4f7d-b0c3-aed5f8dd07e5-kube-api-access-28dcc\") pod \"openstackclient\" (UID: \"c4b138e5-c197-4f7d-b0c3-aed5f8dd07e5\") 
" pod="openstack/openstackclient" Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.561069 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c4b138e5-c197-4f7d-b0c3-aed5f8dd07e5-openstack-config\") pod \"openstackclient\" (UID: \"c4b138e5-c197-4f7d-b0c3-aed5f8dd07e5\") " pod="openstack/openstackclient" Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.561108 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4b138e5-c197-4f7d-b0c3-aed5f8dd07e5-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c4b138e5-c197-4f7d-b0c3-aed5f8dd07e5\") " pod="openstack/openstackclient" Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.662375 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c4b138e5-c197-4f7d-b0c3-aed5f8dd07e5-openstack-config\") pod \"openstackclient\" (UID: \"c4b138e5-c197-4f7d-b0c3-aed5f8dd07e5\") " pod="openstack/openstackclient" Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.662442 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4b138e5-c197-4f7d-b0c3-aed5f8dd07e5-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c4b138e5-c197-4f7d-b0c3-aed5f8dd07e5\") " pod="openstack/openstackclient" Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.662677 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c4b138e5-c197-4f7d-b0c3-aed5f8dd07e5-openstack-config-secret\") pod \"openstackclient\" (UID: \"c4b138e5-c197-4f7d-b0c3-aed5f8dd07e5\") " pod="openstack/openstackclient" Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.663218 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c4b138e5-c197-4f7d-b0c3-aed5f8dd07e5-openstack-config\") pod \"openstackclient\" (UID: \"c4b138e5-c197-4f7d-b0c3-aed5f8dd07e5\") " pod="openstack/openstackclient" Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.663555 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28dcc\" (UniqueName: \"kubernetes.io/projected/c4b138e5-c197-4f7d-b0c3-aed5f8dd07e5-kube-api-access-28dcc\") pod \"openstackclient\" (UID: \"c4b138e5-c197-4f7d-b0c3-aed5f8dd07e5\") " pod="openstack/openstackclient" Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.670663 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c4b138e5-c197-4f7d-b0c3-aed5f8dd07e5-openstack-config-secret\") pod \"openstackclient\" (UID: \"c4b138e5-c197-4f7d-b0c3-aed5f8dd07e5\") " pod="openstack/openstackclient" Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.676298 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4b138e5-c197-4f7d-b0c3-aed5f8dd07e5-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c4b138e5-c197-4f7d-b0c3-aed5f8dd07e5\") " pod="openstack/openstackclient" Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.682159 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28dcc\" (UniqueName: \"kubernetes.io/projected/c4b138e5-c197-4f7d-b0c3-aed5f8dd07e5-kube-api-access-28dcc\") pod \"openstackclient\" (UID: \"c4b138e5-c197-4f7d-b0c3-aed5f8dd07e5\") " pod="openstack/openstackclient" Oct 01 10:34:01 crc kubenswrapper[4735]: I1001 10:34:01.697280 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 01 10:34:02 crc kubenswrapper[4735]: I1001 10:34:02.544567 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7bcff764fb-c7nmm" podUID="45ed423f-4895-4df3-9a04-2b916f38f57d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Oct 01 10:34:04 crc kubenswrapper[4735]: I1001 10:34:04.507623 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-85c495dfd8-k8nc5" Oct 01 10:34:04 crc kubenswrapper[4735]: I1001 10:34:04.717006 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-85c495dfd8-k8nc5" Oct 01 10:34:05 crc kubenswrapper[4735]: I1001 10:34:05.485445 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 10:34:05 crc kubenswrapper[4735]: I1001 10:34:05.485766 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 10:34:05 crc kubenswrapper[4735]: I1001 10:34:05.485812 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" Oct 01 10:34:05 crc kubenswrapper[4735]: I1001 10:34:05.486632 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"1d509f3e1d9829219adbe6f0a296874023b5cdfe25a87df90afeebbd5d68c288"} pod="openshift-machine-config-operator/machine-config-daemon-xgg24" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 10:34:05 crc kubenswrapper[4735]: I1001 10:34:05.486697 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" containerID="cri-o://1d509f3e1d9829219adbe6f0a296874023b5cdfe25a87df90afeebbd5d68c288" gracePeriod=600 Oct 01 10:34:06 crc kubenswrapper[4735]: I1001 10:34:06.183844 4735 generic.go:334] "Generic (PLEG): container finished" podID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerID="1d509f3e1d9829219adbe6f0a296874023b5cdfe25a87df90afeebbd5d68c288" exitCode=0 Oct 01 10:34:06 crc kubenswrapper[4735]: I1001 10:34:06.183885 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" event={"ID":"8c2fdbf0-2469-4ca0-8624-d63609123cd1","Type":"ContainerDied","Data":"1d509f3e1d9829219adbe6f0a296874023b5cdfe25a87df90afeebbd5d68c288"} Oct 01 10:34:06 crc kubenswrapper[4735]: I1001 10:34:06.584478 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:34:06 crc kubenswrapper[4735]: I1001 10:34:06.682127 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-696cd688cf-kqrbf"] Oct 01 10:34:06 crc kubenswrapper[4735]: I1001 10:34:06.683695 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-696cd688cf-kqrbf" Oct 01 10:34:06 crc kubenswrapper[4735]: I1001 10:34:06.688332 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 01 10:34:06 crc kubenswrapper[4735]: I1001 10:34:06.688410 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 01 10:34:06 crc kubenswrapper[4735]: I1001 10:34:06.688627 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 01 10:34:06 crc kubenswrapper[4735]: I1001 10:34:06.704614 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-696cd688cf-kqrbf"] Oct 01 10:34:06 crc kubenswrapper[4735]: I1001 10:34:06.850613 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95643272-0db0-4c04-9087-98321b57c893-combined-ca-bundle\") pod \"swift-proxy-696cd688cf-kqrbf\" (UID: \"95643272-0db0-4c04-9087-98321b57c893\") " pod="openstack/swift-proxy-696cd688cf-kqrbf" Oct 01 10:34:06 crc kubenswrapper[4735]: I1001 10:34:06.850668 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/95643272-0db0-4c04-9087-98321b57c893-etc-swift\") pod \"swift-proxy-696cd688cf-kqrbf\" (UID: \"95643272-0db0-4c04-9087-98321b57c893\") " pod="openstack/swift-proxy-696cd688cf-kqrbf" Oct 01 10:34:06 crc kubenswrapper[4735]: I1001 10:34:06.850698 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7mfm\" (UniqueName: \"kubernetes.io/projected/95643272-0db0-4c04-9087-98321b57c893-kube-api-access-r7mfm\") pod \"swift-proxy-696cd688cf-kqrbf\" (UID: \"95643272-0db0-4c04-9087-98321b57c893\") " pod="openstack/swift-proxy-696cd688cf-kqrbf" Oct 01 10:34:06 crc 
kubenswrapper[4735]: I1001 10:34:06.850720 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95643272-0db0-4c04-9087-98321b57c893-run-httpd\") pod \"swift-proxy-696cd688cf-kqrbf\" (UID: \"95643272-0db0-4c04-9087-98321b57c893\") " pod="openstack/swift-proxy-696cd688cf-kqrbf" Oct 01 10:34:06 crc kubenswrapper[4735]: I1001 10:34:06.850753 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95643272-0db0-4c04-9087-98321b57c893-config-data\") pod \"swift-proxy-696cd688cf-kqrbf\" (UID: \"95643272-0db0-4c04-9087-98321b57c893\") " pod="openstack/swift-proxy-696cd688cf-kqrbf" Oct 01 10:34:06 crc kubenswrapper[4735]: I1001 10:34:06.850820 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95643272-0db0-4c04-9087-98321b57c893-public-tls-certs\") pod \"swift-proxy-696cd688cf-kqrbf\" (UID: \"95643272-0db0-4c04-9087-98321b57c893\") " pod="openstack/swift-proxy-696cd688cf-kqrbf" Oct 01 10:34:06 crc kubenswrapper[4735]: I1001 10:34:06.850848 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/95643272-0db0-4c04-9087-98321b57c893-internal-tls-certs\") pod \"swift-proxy-696cd688cf-kqrbf\" (UID: \"95643272-0db0-4c04-9087-98321b57c893\") " pod="openstack/swift-proxy-696cd688cf-kqrbf" Oct 01 10:34:06 crc kubenswrapper[4735]: I1001 10:34:06.850895 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95643272-0db0-4c04-9087-98321b57c893-log-httpd\") pod \"swift-proxy-696cd688cf-kqrbf\" (UID: \"95643272-0db0-4c04-9087-98321b57c893\") " pod="openstack/swift-proxy-696cd688cf-kqrbf" 
Oct 01 10:34:06 crc kubenswrapper[4735]: I1001 10:34:06.952424 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95643272-0db0-4c04-9087-98321b57c893-combined-ca-bundle\") pod \"swift-proxy-696cd688cf-kqrbf\" (UID: \"95643272-0db0-4c04-9087-98321b57c893\") " pod="openstack/swift-proxy-696cd688cf-kqrbf" Oct 01 10:34:06 crc kubenswrapper[4735]: I1001 10:34:06.952481 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/95643272-0db0-4c04-9087-98321b57c893-etc-swift\") pod \"swift-proxy-696cd688cf-kqrbf\" (UID: \"95643272-0db0-4c04-9087-98321b57c893\") " pod="openstack/swift-proxy-696cd688cf-kqrbf" Oct 01 10:34:06 crc kubenswrapper[4735]: I1001 10:34:06.952519 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7mfm\" (UniqueName: \"kubernetes.io/projected/95643272-0db0-4c04-9087-98321b57c893-kube-api-access-r7mfm\") pod \"swift-proxy-696cd688cf-kqrbf\" (UID: \"95643272-0db0-4c04-9087-98321b57c893\") " pod="openstack/swift-proxy-696cd688cf-kqrbf" Oct 01 10:34:06 crc kubenswrapper[4735]: I1001 10:34:06.952541 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95643272-0db0-4c04-9087-98321b57c893-run-httpd\") pod \"swift-proxy-696cd688cf-kqrbf\" (UID: \"95643272-0db0-4c04-9087-98321b57c893\") " pod="openstack/swift-proxy-696cd688cf-kqrbf" Oct 01 10:34:06 crc kubenswrapper[4735]: I1001 10:34:06.952684 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95643272-0db0-4c04-9087-98321b57c893-config-data\") pod \"swift-proxy-696cd688cf-kqrbf\" (UID: \"95643272-0db0-4c04-9087-98321b57c893\") " pod="openstack/swift-proxy-696cd688cf-kqrbf" Oct 01 10:34:06 crc kubenswrapper[4735]: I1001 10:34:06.952703 
4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95643272-0db0-4c04-9087-98321b57c893-public-tls-certs\") pod \"swift-proxy-696cd688cf-kqrbf\" (UID: \"95643272-0db0-4c04-9087-98321b57c893\") " pod="openstack/swift-proxy-696cd688cf-kqrbf" Oct 01 10:34:06 crc kubenswrapper[4735]: I1001 10:34:06.952730 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/95643272-0db0-4c04-9087-98321b57c893-internal-tls-certs\") pod \"swift-proxy-696cd688cf-kqrbf\" (UID: \"95643272-0db0-4c04-9087-98321b57c893\") " pod="openstack/swift-proxy-696cd688cf-kqrbf" Oct 01 10:34:06 crc kubenswrapper[4735]: I1001 10:34:06.952785 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95643272-0db0-4c04-9087-98321b57c893-log-httpd\") pod \"swift-proxy-696cd688cf-kqrbf\" (UID: \"95643272-0db0-4c04-9087-98321b57c893\") " pod="openstack/swift-proxy-696cd688cf-kqrbf" Oct 01 10:34:06 crc kubenswrapper[4735]: I1001 10:34:06.953199 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95643272-0db0-4c04-9087-98321b57c893-log-httpd\") pod \"swift-proxy-696cd688cf-kqrbf\" (UID: \"95643272-0db0-4c04-9087-98321b57c893\") " pod="openstack/swift-proxy-696cd688cf-kqrbf" Oct 01 10:34:06 crc kubenswrapper[4735]: I1001 10:34:06.953254 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95643272-0db0-4c04-9087-98321b57c893-run-httpd\") pod \"swift-proxy-696cd688cf-kqrbf\" (UID: \"95643272-0db0-4c04-9087-98321b57c893\") " pod="openstack/swift-proxy-696cd688cf-kqrbf" Oct 01 10:34:06 crc kubenswrapper[4735]: I1001 10:34:06.959966 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/95643272-0db0-4c04-9087-98321b57c893-config-data\") pod \"swift-proxy-696cd688cf-kqrbf\" (UID: \"95643272-0db0-4c04-9087-98321b57c893\") " pod="openstack/swift-proxy-696cd688cf-kqrbf" Oct 01 10:34:06 crc kubenswrapper[4735]: I1001 10:34:06.960006 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95643272-0db0-4c04-9087-98321b57c893-public-tls-certs\") pod \"swift-proxy-696cd688cf-kqrbf\" (UID: \"95643272-0db0-4c04-9087-98321b57c893\") " pod="openstack/swift-proxy-696cd688cf-kqrbf" Oct 01 10:34:06 crc kubenswrapper[4735]: I1001 10:34:06.960698 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95643272-0db0-4c04-9087-98321b57c893-combined-ca-bundle\") pod \"swift-proxy-696cd688cf-kqrbf\" (UID: \"95643272-0db0-4c04-9087-98321b57c893\") " pod="openstack/swift-proxy-696cd688cf-kqrbf" Oct 01 10:34:06 crc kubenswrapper[4735]: I1001 10:34:06.965179 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/95643272-0db0-4c04-9087-98321b57c893-internal-tls-certs\") pod \"swift-proxy-696cd688cf-kqrbf\" (UID: \"95643272-0db0-4c04-9087-98321b57c893\") " pod="openstack/swift-proxy-696cd688cf-kqrbf" Oct 01 10:34:06 crc kubenswrapper[4735]: I1001 10:34:06.967741 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/95643272-0db0-4c04-9087-98321b57c893-etc-swift\") pod \"swift-proxy-696cd688cf-kqrbf\" (UID: \"95643272-0db0-4c04-9087-98321b57c893\") " pod="openstack/swift-proxy-696cd688cf-kqrbf" Oct 01 10:34:06 crc kubenswrapper[4735]: I1001 10:34:06.970164 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7mfm\" (UniqueName: 
\"kubernetes.io/projected/95643272-0db0-4c04-9087-98321b57c893-kube-api-access-r7mfm\") pod \"swift-proxy-696cd688cf-kqrbf\" (UID: \"95643272-0db0-4c04-9087-98321b57c893\") " pod="openstack/swift-proxy-696cd688cf-kqrbf" Oct 01 10:34:07 crc kubenswrapper[4735]: I1001 10:34:07.001479 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-696cd688cf-kqrbf" Oct 01 10:34:10 crc kubenswrapper[4735]: I1001 10:34:10.590486 4735 scope.go:117] "RemoveContainer" containerID="2af2e089638b91b79b45466ec706a3fb255cc6d4b97f5b95a6d6830b44a807b5" Oct 01 10:34:11 crc kubenswrapper[4735]: I1001 10:34:11.029308 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-zjsng"] Oct 01 10:34:11 crc kubenswrapper[4735]: I1001 10:34:11.037746 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zjsng" Oct 01 10:34:11 crc kubenswrapper[4735]: I1001 10:34:11.045916 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-zjsng"] Oct 01 10:34:11 crc kubenswrapper[4735]: I1001 10:34:11.133225 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-lcc6x"] Oct 01 10:34:11 crc kubenswrapper[4735]: I1001 10:34:11.134760 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-lcc6x" Oct 01 10:34:11 crc kubenswrapper[4735]: I1001 10:34:11.152474 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-lcc6x"] Oct 01 10:34:11 crc kubenswrapper[4735]: I1001 10:34:11.154233 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5tp9\" (UniqueName: \"kubernetes.io/projected/f6039fc3-1cee-4902-81bd-0f35cf2eaa96-kube-api-access-n5tp9\") pod \"nova-cell0-db-create-lcc6x\" (UID: \"f6039fc3-1cee-4902-81bd-0f35cf2eaa96\") " pod="openstack/nova-cell0-db-create-lcc6x" Oct 01 10:34:11 crc kubenswrapper[4735]: I1001 10:34:11.154391 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sntxx\" (UniqueName: \"kubernetes.io/projected/3aac33e2-e90f-4db8-95a4-b676416a6781-kube-api-access-sntxx\") pod \"nova-api-db-create-zjsng\" (UID: \"3aac33e2-e90f-4db8-95a4-b676416a6781\") " pod="openstack/nova-api-db-create-zjsng" Oct 01 10:34:11 crc kubenswrapper[4735]: I1001 10:34:11.255854 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sntxx\" (UniqueName: \"kubernetes.io/projected/3aac33e2-e90f-4db8-95a4-b676416a6781-kube-api-access-sntxx\") pod \"nova-api-db-create-zjsng\" (UID: \"3aac33e2-e90f-4db8-95a4-b676416a6781\") " pod="openstack/nova-api-db-create-zjsng" Oct 01 10:34:11 crc kubenswrapper[4735]: I1001 10:34:11.255955 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5tp9\" (UniqueName: \"kubernetes.io/projected/f6039fc3-1cee-4902-81bd-0f35cf2eaa96-kube-api-access-n5tp9\") pod \"nova-cell0-db-create-lcc6x\" (UID: \"f6039fc3-1cee-4902-81bd-0f35cf2eaa96\") " pod="openstack/nova-cell0-db-create-lcc6x" Oct 01 10:34:11 crc kubenswrapper[4735]: I1001 10:34:11.275438 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-n5tp9\" (UniqueName: \"kubernetes.io/projected/f6039fc3-1cee-4902-81bd-0f35cf2eaa96-kube-api-access-n5tp9\") pod \"nova-cell0-db-create-lcc6x\" (UID: \"f6039fc3-1cee-4902-81bd-0f35cf2eaa96\") " pod="openstack/nova-cell0-db-create-lcc6x" Oct 01 10:34:11 crc kubenswrapper[4735]: I1001 10:34:11.287137 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sntxx\" (UniqueName: \"kubernetes.io/projected/3aac33e2-e90f-4db8-95a4-b676416a6781-kube-api-access-sntxx\") pod \"nova-api-db-create-zjsng\" (UID: \"3aac33e2-e90f-4db8-95a4-b676416a6781\") " pod="openstack/nova-api-db-create-zjsng" Oct 01 10:34:11 crc kubenswrapper[4735]: I1001 10:34:11.327093 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-5nzjx"] Oct 01 10:34:11 crc kubenswrapper[4735]: I1001 10:34:11.328267 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-5nzjx" Oct 01 10:34:11 crc kubenswrapper[4735]: I1001 10:34:11.341179 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-5nzjx"] Oct 01 10:34:11 crc kubenswrapper[4735]: I1001 10:34:11.357162 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zjsng" Oct 01 10:34:11 crc kubenswrapper[4735]: I1001 10:34:11.357734 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gk7g\" (UniqueName: \"kubernetes.io/projected/8163cb1d-c1d1-48cd-8e45-6f239a7095c1-kube-api-access-6gk7g\") pod \"nova-cell1-db-create-5nzjx\" (UID: \"8163cb1d-c1d1-48cd-8e45-6f239a7095c1\") " pod="openstack/nova-cell1-db-create-5nzjx" Oct 01 10:34:11 crc kubenswrapper[4735]: I1001 10:34:11.453961 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-lcc6x" Oct 01 10:34:11 crc kubenswrapper[4735]: I1001 10:34:11.459657 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gk7g\" (UniqueName: \"kubernetes.io/projected/8163cb1d-c1d1-48cd-8e45-6f239a7095c1-kube-api-access-6gk7g\") pod \"nova-cell1-db-create-5nzjx\" (UID: \"8163cb1d-c1d1-48cd-8e45-6f239a7095c1\") " pod="openstack/nova-cell1-db-create-5nzjx" Oct 01 10:34:11 crc kubenswrapper[4735]: I1001 10:34:11.481529 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gk7g\" (UniqueName: \"kubernetes.io/projected/8163cb1d-c1d1-48cd-8e45-6f239a7095c1-kube-api-access-6gk7g\") pod \"nova-cell1-db-create-5nzjx\" (UID: \"8163cb1d-c1d1-48cd-8e45-6f239a7095c1\") " pod="openstack/nova-cell1-db-create-5nzjx" Oct 01 10:34:11 crc kubenswrapper[4735]: I1001 10:34:11.680011 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-5nzjx" Oct 01 10:34:12 crc kubenswrapper[4735]: E1001 10:34:12.168971 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 01 10:34:12 crc kubenswrapper[4735]: E1001 10:34:12.169122 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mbrm9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-84vvz_openstack(61dd2f37-7f60-42f5-a3d0-3b693d1e64be): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 10:34:12 crc kubenswrapper[4735]: E1001 10:34:12.170706 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-84vvz" podUID="61dd2f37-7f60-42f5-a3d0-3b693d1e64be" Oct 01 10:34:12 crc kubenswrapper[4735]: I1001 10:34:12.544737 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7bcff764fb-c7nmm" podUID="45ed423f-4895-4df3-9a04-2b916f38f57d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Oct 01 10:34:12 crc kubenswrapper[4735]: I1001 10:34:12.545230 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7bcff764fb-c7nmm" Oct 01 10:34:12 crc kubenswrapper[4735]: I1001 10:34:12.763086 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-zjsng"] Oct 01 10:34:12 crc kubenswrapper[4735]: I1001 10:34:12.769525 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-lcc6x"] Oct 01 10:34:12 crc kubenswrapper[4735]: I1001 10:34:12.775054 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 01 10:34:12 crc kubenswrapper[4735]: I1001 10:34:12.780371 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-5nzjx"] Oct 01 10:34:12 crc kubenswrapper[4735]: I1001 
10:34:12.836038 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-696cd688cf-kqrbf"] Oct 01 10:34:12 crc kubenswrapper[4735]: W1001 10:34:12.838586 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95643272_0db0_4c04_9087_98321b57c893.slice/crio-c1549d28e91ea14f82684bd22ae0e32f5bf7ad8258f2c980d69a655aa4c234a6 WatchSource:0}: Error finding container c1549d28e91ea14f82684bd22ae0e32f5bf7ad8258f2c980d69a655aa4c234a6: Status 404 returned error can't find the container with id c1549d28e91ea14f82684bd22ae0e32f5bf7ad8258f2c980d69a655aa4c234a6 Oct 01 10:34:13 crc kubenswrapper[4735]: I1001 10:34:13.269741 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" event={"ID":"8c2fdbf0-2469-4ca0-8624-d63609123cd1","Type":"ContainerStarted","Data":"6042cbc8542ef3e83bd7d2006832c7a2d565a12a6c7538fc83d415163087f591"} Oct 01 10:34:13 crc kubenswrapper[4735]: I1001 10:34:13.272592 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8cvdn" event={"ID":"541784dc-4146-459d-bee0-2f97d22a7977","Type":"ContainerStarted","Data":"5688c7fc60494353be2b6ebfcefc98759a71666e60aa632e6319fbe57301999d"} Oct 01 10:34:13 crc kubenswrapper[4735]: I1001 10:34:13.275264 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-lcc6x" event={"ID":"f6039fc3-1cee-4902-81bd-0f35cf2eaa96","Type":"ContainerStarted","Data":"a97369da3fa95662846cba8da6be9a0d38d7dd64fd9aa8205804c9c4faa7904f"} Oct 01 10:34:13 crc kubenswrapper[4735]: I1001 10:34:13.275287 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-lcc6x" event={"ID":"f6039fc3-1cee-4902-81bd-0f35cf2eaa96","Type":"ContainerStarted","Data":"e226ed983ccd85a2b692090516b07906663bb07c9da43057f289ce293e4c6297"} Oct 01 10:34:13 crc kubenswrapper[4735]: I1001 10:34:13.277782 4735 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5nzjx" event={"ID":"8163cb1d-c1d1-48cd-8e45-6f239a7095c1","Type":"ContainerStarted","Data":"beb2507f6d4c56e9075de1d8ca34ff241ef528a92894be152b5b153d79b01c12"} Oct 01 10:34:13 crc kubenswrapper[4735]: I1001 10:34:13.277826 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5nzjx" event={"ID":"8163cb1d-c1d1-48cd-8e45-6f239a7095c1","Type":"ContainerStarted","Data":"3269cce07dda2258ea0def6cf3cd8ad7a91c084b2cce31f5faa71c4dc1d3dcd5"} Oct 01 10:34:13 crc kubenswrapper[4735]: I1001 10:34:13.279927 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zjsng" event={"ID":"3aac33e2-e90f-4db8-95a4-b676416a6781","Type":"ContainerStarted","Data":"c3ceca45415b38d092d174cc57e27229ae5db3eee3de8baeae4b56e8e5c00c68"} Oct 01 10:34:13 crc kubenswrapper[4735]: I1001 10:34:13.279987 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zjsng" event={"ID":"3aac33e2-e90f-4db8-95a4-b676416a6781","Type":"ContainerStarted","Data":"39e51bbb806122852d4261f3ac6b7dec1b6c4165ee8ead257177c181562050c0"} Oct 01 10:34:13 crc kubenswrapper[4735]: I1001 10:34:13.281686 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-696cd688cf-kqrbf" event={"ID":"95643272-0db0-4c04-9087-98321b57c893","Type":"ContainerStarted","Data":"35f0556d040e8a24c8641e49eadea95ec1452d7b4b466cede1279e1d41349836"} Oct 01 10:34:13 crc kubenswrapper[4735]: I1001 10:34:13.281715 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-696cd688cf-kqrbf" event={"ID":"95643272-0db0-4c04-9087-98321b57c893","Type":"ContainerStarted","Data":"c1549d28e91ea14f82684bd22ae0e32f5bf7ad8258f2c980d69a655aa4c234a6"} Oct 01 10:34:13 crc kubenswrapper[4735]: I1001 10:34:13.282911 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" 
event={"ID":"c4b138e5-c197-4f7d-b0c3-aed5f8dd07e5","Type":"ContainerStarted","Data":"343718ee85fee443199d1f34d9a44bdb270ba45d4a2029e25ae02f305077b44b"} Oct 01 10:34:13 crc kubenswrapper[4735]: I1001 10:34:13.306013 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-lcc6x" podStartSLOduration=2.3059977959999998 podStartE2EDuration="2.305997796s" podCreationTimestamp="2025-10-01 10:34:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:34:13.304392843 +0000 UTC m=+1011.997214105" watchObservedRunningTime="2025-10-01 10:34:13.305997796 +0000 UTC m=+1011.998819058" Oct 01 10:34:13 crc kubenswrapper[4735]: I1001 10:34:13.324716 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-zjsng" podStartSLOduration=2.3247002070000002 podStartE2EDuration="2.324700207s" podCreationTimestamp="2025-10-01 10:34:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:34:13.318134101 +0000 UTC m=+1012.010955363" watchObservedRunningTime="2025-10-01 10:34:13.324700207 +0000 UTC m=+1012.017521469" Oct 01 10:34:13 crc kubenswrapper[4735]: I1001 10:34:13.338449 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-8cvdn" podStartSLOduration=2.095016192 podStartE2EDuration="46.338434083s" podCreationTimestamp="2025-10-01 10:33:27 +0000 UTC" firstStartedPulling="2025-10-01 10:33:27.966676662 +0000 UTC m=+966.659497924" lastFinishedPulling="2025-10-01 10:34:12.210094553 +0000 UTC m=+1010.902915815" observedRunningTime="2025-10-01 10:34:13.334781366 +0000 UTC m=+1012.027602628" watchObservedRunningTime="2025-10-01 10:34:13.338434083 +0000 UTC m=+1012.031255345" Oct 01 10:34:13 crc kubenswrapper[4735]: I1001 10:34:13.348584 4735 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-5nzjx" podStartSLOduration=2.348567915 podStartE2EDuration="2.348567915s" podCreationTimestamp="2025-10-01 10:34:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:34:13.345881093 +0000 UTC m=+1012.038702355" watchObservedRunningTime="2025-10-01 10:34:13.348567915 +0000 UTC m=+1012.041389177" Oct 01 10:34:14 crc kubenswrapper[4735]: I1001 10:34:14.293361 4735 generic.go:334] "Generic (PLEG): container finished" podID="f6039fc3-1cee-4902-81bd-0f35cf2eaa96" containerID="a97369da3fa95662846cba8da6be9a0d38d7dd64fd9aa8205804c9c4faa7904f" exitCode=0 Oct 01 10:34:14 crc kubenswrapper[4735]: I1001 10:34:14.293533 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-lcc6x" event={"ID":"f6039fc3-1cee-4902-81bd-0f35cf2eaa96","Type":"ContainerDied","Data":"a97369da3fa95662846cba8da6be9a0d38d7dd64fd9aa8205804c9c4faa7904f"} Oct 01 10:34:14 crc kubenswrapper[4735]: I1001 10:34:14.297006 4735 generic.go:334] "Generic (PLEG): container finished" podID="8163cb1d-c1d1-48cd-8e45-6f239a7095c1" containerID="beb2507f6d4c56e9075de1d8ca34ff241ef528a92894be152b5b153d79b01c12" exitCode=0 Oct 01 10:34:14 crc kubenswrapper[4735]: I1001 10:34:14.297059 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5nzjx" event={"ID":"8163cb1d-c1d1-48cd-8e45-6f239a7095c1","Type":"ContainerDied","Data":"beb2507f6d4c56e9075de1d8ca34ff241ef528a92894be152b5b153d79b01c12"} Oct 01 10:34:14 crc kubenswrapper[4735]: I1001 10:34:14.298399 4735 generic.go:334] "Generic (PLEG): container finished" podID="3aac33e2-e90f-4db8-95a4-b676416a6781" containerID="c3ceca45415b38d092d174cc57e27229ae5db3eee3de8baeae4b56e8e5c00c68" exitCode=0 Oct 01 10:34:14 crc kubenswrapper[4735]: I1001 10:34:14.298443 4735 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-api-db-create-zjsng" event={"ID":"3aac33e2-e90f-4db8-95a4-b676416a6781","Type":"ContainerDied","Data":"c3ceca45415b38d092d174cc57e27229ae5db3eee3de8baeae4b56e8e5c00c68"} Oct 01 10:34:14 crc kubenswrapper[4735]: I1001 10:34:14.300522 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-696cd688cf-kqrbf" event={"ID":"95643272-0db0-4c04-9087-98321b57c893","Type":"ContainerStarted","Data":"3445c25ddd824265660f460364e62f317d15b4fa58e7c94c29339ad4f235944e"} Oct 01 10:34:14 crc kubenswrapper[4735]: I1001 10:34:14.300592 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-696cd688cf-kqrbf" Oct 01 10:34:14 crc kubenswrapper[4735]: I1001 10:34:14.300757 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-696cd688cf-kqrbf" Oct 01 10:34:14 crc kubenswrapper[4735]: I1001 10:34:14.347067 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-696cd688cf-kqrbf" podStartSLOduration=8.347040962 podStartE2EDuration="8.347040962s" podCreationTimestamp="2025-10-01 10:34:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:34:14.336765837 +0000 UTC m=+1013.029587099" watchObservedRunningTime="2025-10-01 10:34:14.347040962 +0000 UTC m=+1013.039862224" Oct 01 10:34:15 crc kubenswrapper[4735]: I1001 10:34:15.546654 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-94689896b-kwvtc" Oct 01 10:34:16 crc kubenswrapper[4735]: I1001 10:34:16.706259 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-lcc6x" Oct 01 10:34:16 crc kubenswrapper[4735]: I1001 10:34:16.768250 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5tp9\" (UniqueName: \"kubernetes.io/projected/f6039fc3-1cee-4902-81bd-0f35cf2eaa96-kube-api-access-n5tp9\") pod \"f6039fc3-1cee-4902-81bd-0f35cf2eaa96\" (UID: \"f6039fc3-1cee-4902-81bd-0f35cf2eaa96\") " Oct 01 10:34:16 crc kubenswrapper[4735]: I1001 10:34:16.773964 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6039fc3-1cee-4902-81bd-0f35cf2eaa96-kube-api-access-n5tp9" (OuterVolumeSpecName: "kube-api-access-n5tp9") pod "f6039fc3-1cee-4902-81bd-0f35cf2eaa96" (UID: "f6039fc3-1cee-4902-81bd-0f35cf2eaa96"). InnerVolumeSpecName "kube-api-access-n5tp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:34:16 crc kubenswrapper[4735]: I1001 10:34:16.870251 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5tp9\" (UniqueName: \"kubernetes.io/projected/f6039fc3-1cee-4902-81bd-0f35cf2eaa96-kube-api-access-n5tp9\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:17 crc kubenswrapper[4735]: I1001 10:34:17.338803 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-lcc6x" event={"ID":"f6039fc3-1cee-4902-81bd-0f35cf2eaa96","Type":"ContainerDied","Data":"e226ed983ccd85a2b692090516b07906663bb07c9da43057f289ce293e4c6297"} Oct 01 10:34:17 crc kubenswrapper[4735]: I1001 10:34:17.339080 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e226ed983ccd85a2b692090516b07906663bb07c9da43057f289ce293e4c6297" Oct 01 10:34:17 crc kubenswrapper[4735]: I1001 10:34:17.338827 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-lcc6x" Oct 01 10:34:17 crc kubenswrapper[4735]: I1001 10:34:17.341149 4735 generic.go:334] "Generic (PLEG): container finished" podID="45ed423f-4895-4df3-9a04-2b916f38f57d" containerID="3c26fba4b03c5d2ead52a8171b3d00ed23efc77ca18ab4fe450fd043e2e25cb4" exitCode=137 Oct 01 10:34:17 crc kubenswrapper[4735]: I1001 10:34:17.341195 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bcff764fb-c7nmm" event={"ID":"45ed423f-4895-4df3-9a04-2b916f38f57d","Type":"ContainerDied","Data":"3c26fba4b03c5d2ead52a8171b3d00ed23efc77ca18ab4fe450fd043e2e25cb4"} Oct 01 10:34:17 crc kubenswrapper[4735]: I1001 10:34:17.516616 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-5nzjx" Oct 01 10:34:17 crc kubenswrapper[4735]: I1001 10:34:17.527445 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zjsng" Oct 01 10:34:17 crc kubenswrapper[4735]: I1001 10:34:17.581746 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sntxx\" (UniqueName: \"kubernetes.io/projected/3aac33e2-e90f-4db8-95a4-b676416a6781-kube-api-access-sntxx\") pod \"3aac33e2-e90f-4db8-95a4-b676416a6781\" (UID: \"3aac33e2-e90f-4db8-95a4-b676416a6781\") " Oct 01 10:34:17 crc kubenswrapper[4735]: I1001 10:34:17.581802 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gk7g\" (UniqueName: \"kubernetes.io/projected/8163cb1d-c1d1-48cd-8e45-6f239a7095c1-kube-api-access-6gk7g\") pod \"8163cb1d-c1d1-48cd-8e45-6f239a7095c1\" (UID: \"8163cb1d-c1d1-48cd-8e45-6f239a7095c1\") " Oct 01 10:34:17 crc kubenswrapper[4735]: I1001 10:34:17.587971 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aac33e2-e90f-4db8-95a4-b676416a6781-kube-api-access-sntxx" (OuterVolumeSpecName: 
"kube-api-access-sntxx") pod "3aac33e2-e90f-4db8-95a4-b676416a6781" (UID: "3aac33e2-e90f-4db8-95a4-b676416a6781"). InnerVolumeSpecName "kube-api-access-sntxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:34:17 crc kubenswrapper[4735]: I1001 10:34:17.588361 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8163cb1d-c1d1-48cd-8e45-6f239a7095c1-kube-api-access-6gk7g" (OuterVolumeSpecName: "kube-api-access-6gk7g") pod "8163cb1d-c1d1-48cd-8e45-6f239a7095c1" (UID: "8163cb1d-c1d1-48cd-8e45-6f239a7095c1"). InnerVolumeSpecName "kube-api-access-6gk7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:34:17 crc kubenswrapper[4735]: I1001 10:34:17.683778 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sntxx\" (UniqueName: \"kubernetes.io/projected/3aac33e2-e90f-4db8-95a4-b676416a6781-kube-api-access-sntxx\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:17 crc kubenswrapper[4735]: I1001 10:34:17.683819 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gk7g\" (UniqueName: \"kubernetes.io/projected/8163cb1d-c1d1-48cd-8e45-6f239a7095c1-kube-api-access-6gk7g\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:17 crc kubenswrapper[4735]: I1001 10:34:17.928441 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 10:34:17 crc kubenswrapper[4735]: I1001 10:34:17.928907 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4" containerName="glance-log" containerID="cri-o://29221342824c12498a2445acd6150e30aae7f82799dfded43e139a3d552e1f47" gracePeriod=30 Oct 01 10:34:17 crc kubenswrapper[4735]: I1001 10:34:17.929411 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4" 
containerName="glance-httpd" containerID="cri-o://4c27190f8b3f02d64f090187bca72a8c38246d3cfc9366619beef07e151a043b" gracePeriod=30 Oct 01 10:34:17 crc kubenswrapper[4735]: I1001 10:34:17.954740 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.150:9292/healthcheck\": EOF" Oct 01 10:34:18 crc kubenswrapper[4735]: I1001 10:34:18.041624 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-798b4f9b87-frx5r" Oct 01 10:34:18 crc kubenswrapper[4735]: I1001 10:34:18.103333 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-94689896b-kwvtc"] Oct 01 10:34:18 crc kubenswrapper[4735]: I1001 10:34:18.103654 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-94689896b-kwvtc" podUID="5f2b4d4c-d741-4a88-b71c-fda5946d3896" containerName="neutron-api" containerID="cri-o://4c2bb6ac267ef9269e11d07112b91da398af5bddd4295e35e8730117353b16f4" gracePeriod=30 Oct 01 10:34:18 crc kubenswrapper[4735]: I1001 10:34:18.103719 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-94689896b-kwvtc" podUID="5f2b4d4c-d741-4a88-b71c-fda5946d3896" containerName="neutron-httpd" containerID="cri-o://efb380d7acdfb188019f59c7ffb0a953d8a3325c7e5425bdcf6f172f779bd4ad" gracePeriod=30 Oct 01 10:34:18 crc kubenswrapper[4735]: I1001 10:34:18.351175 4735 generic.go:334] "Generic (PLEG): container finished" podID="541784dc-4146-459d-bee0-2f97d22a7977" containerID="5688c7fc60494353be2b6ebfcefc98759a71666e60aa632e6319fbe57301999d" exitCode=0 Oct 01 10:34:18 crc kubenswrapper[4735]: I1001 10:34:18.351256 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8cvdn" 
event={"ID":"541784dc-4146-459d-bee0-2f97d22a7977","Type":"ContainerDied","Data":"5688c7fc60494353be2b6ebfcefc98759a71666e60aa632e6319fbe57301999d"} Oct 01 10:34:18 crc kubenswrapper[4735]: I1001 10:34:18.354794 4735 generic.go:334] "Generic (PLEG): container finished" podID="5f2b4d4c-d741-4a88-b71c-fda5946d3896" containerID="efb380d7acdfb188019f59c7ffb0a953d8a3325c7e5425bdcf6f172f779bd4ad" exitCode=0 Oct 01 10:34:18 crc kubenswrapper[4735]: I1001 10:34:18.354831 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-94689896b-kwvtc" event={"ID":"5f2b4d4c-d741-4a88-b71c-fda5946d3896","Type":"ContainerDied","Data":"efb380d7acdfb188019f59c7ffb0a953d8a3325c7e5425bdcf6f172f779bd4ad"} Oct 01 10:34:18 crc kubenswrapper[4735]: I1001 10:34:18.356222 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5nzjx" event={"ID":"8163cb1d-c1d1-48cd-8e45-6f239a7095c1","Type":"ContainerDied","Data":"3269cce07dda2258ea0def6cf3cd8ad7a91c084b2cce31f5faa71c4dc1d3dcd5"} Oct 01 10:34:18 crc kubenswrapper[4735]: I1001 10:34:18.356240 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-5nzjx" Oct 01 10:34:18 crc kubenswrapper[4735]: I1001 10:34:18.356260 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3269cce07dda2258ea0def6cf3cd8ad7a91c084b2cce31f5faa71c4dc1d3dcd5" Oct 01 10:34:18 crc kubenswrapper[4735]: I1001 10:34:18.358329 4735 generic.go:334] "Generic (PLEG): container finished" podID="de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4" containerID="29221342824c12498a2445acd6150e30aae7f82799dfded43e139a3d552e1f47" exitCode=143 Oct 01 10:34:18 crc kubenswrapper[4735]: I1001 10:34:18.358382 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4","Type":"ContainerDied","Data":"29221342824c12498a2445acd6150e30aae7f82799dfded43e139a3d552e1f47"} Oct 01 10:34:18 crc kubenswrapper[4735]: I1001 10:34:18.361938 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zjsng" event={"ID":"3aac33e2-e90f-4db8-95a4-b676416a6781","Type":"ContainerDied","Data":"39e51bbb806122852d4261f3ac6b7dec1b6c4165ee8ead257177c181562050c0"} Oct 01 10:34:18 crc kubenswrapper[4735]: I1001 10:34:18.361967 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39e51bbb806122852d4261f3ac6b7dec1b6c4165ee8ead257177c181562050c0" Oct 01 10:34:18 crc kubenswrapper[4735]: I1001 10:34:18.362078 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-zjsng" Oct 01 10:34:18 crc kubenswrapper[4735]: I1001 10:34:18.947938 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 10:34:18 crc kubenswrapper[4735]: I1001 10:34:18.948243 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4c4185de-864c-4329-a2b2-bc7164bb33c6" containerName="glance-log" containerID="cri-o://0f7a8b38e554a2525bda66d4fb06f3bedaae8ede449744a03ede59dbc2b3eabe" gracePeriod=30 Oct 01 10:34:18 crc kubenswrapper[4735]: I1001 10:34:18.948509 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4c4185de-864c-4329-a2b2-bc7164bb33c6" containerName="glance-httpd" containerID="cri-o://0d695d3e4c0af08383ebf43a9745bc57abd3371c82ea2546282bd7cbbcce367d" gracePeriod=30 Oct 01 10:34:19 crc kubenswrapper[4735]: I1001 10:34:19.376403 4735 generic.go:334] "Generic (PLEG): container finished" podID="4c4185de-864c-4329-a2b2-bc7164bb33c6" containerID="0f7a8b38e554a2525bda66d4fb06f3bedaae8ede449744a03ede59dbc2b3eabe" exitCode=143 Oct 01 10:34:19 crc kubenswrapper[4735]: I1001 10:34:19.376475 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4c4185de-864c-4329-a2b2-bc7164bb33c6","Type":"ContainerDied","Data":"0f7a8b38e554a2525bda66d4fb06f3bedaae8ede449744a03ede59dbc2b3eabe"} Oct 01 10:34:21 crc kubenswrapper[4735]: I1001 10:34:21.282452 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-3a5f-account-create-2knz5"] Oct 01 10:34:21 crc kubenswrapper[4735]: E1001 10:34:21.283262 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6039fc3-1cee-4902-81bd-0f35cf2eaa96" containerName="mariadb-database-create" Oct 01 10:34:21 crc kubenswrapper[4735]: I1001 10:34:21.283275 4735 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="f6039fc3-1cee-4902-81bd-0f35cf2eaa96" containerName="mariadb-database-create" Oct 01 10:34:21 crc kubenswrapper[4735]: E1001 10:34:21.283310 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8163cb1d-c1d1-48cd-8e45-6f239a7095c1" containerName="mariadb-database-create" Oct 01 10:34:21 crc kubenswrapper[4735]: I1001 10:34:21.283316 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8163cb1d-c1d1-48cd-8e45-6f239a7095c1" containerName="mariadb-database-create" Oct 01 10:34:21 crc kubenswrapper[4735]: E1001 10:34:21.283330 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aac33e2-e90f-4db8-95a4-b676416a6781" containerName="mariadb-database-create" Oct 01 10:34:21 crc kubenswrapper[4735]: I1001 10:34:21.283337 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aac33e2-e90f-4db8-95a4-b676416a6781" containerName="mariadb-database-create" Oct 01 10:34:21 crc kubenswrapper[4735]: I1001 10:34:21.283507 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aac33e2-e90f-4db8-95a4-b676416a6781" containerName="mariadb-database-create" Oct 01 10:34:21 crc kubenswrapper[4735]: I1001 10:34:21.283560 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6039fc3-1cee-4902-81bd-0f35cf2eaa96" containerName="mariadb-database-create" Oct 01 10:34:21 crc kubenswrapper[4735]: I1001 10:34:21.283568 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="8163cb1d-c1d1-48cd-8e45-6f239a7095c1" containerName="mariadb-database-create" Oct 01 10:34:21 crc kubenswrapper[4735]: I1001 10:34:21.284178 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-3a5f-account-create-2knz5" Oct 01 10:34:21 crc kubenswrapper[4735]: I1001 10:34:21.286472 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 01 10:34:21 crc kubenswrapper[4735]: I1001 10:34:21.294485 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-3a5f-account-create-2knz5"] Oct 01 10:34:21 crc kubenswrapper[4735]: I1001 10:34:21.355049 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwhgh\" (UniqueName: \"kubernetes.io/projected/01d783e4-455a-4e50-a5a6-ca1a289b2ad7-kube-api-access-cwhgh\") pod \"nova-api-3a5f-account-create-2knz5\" (UID: \"01d783e4-455a-4e50-a5a6-ca1a289b2ad7\") " pod="openstack/nova-api-3a5f-account-create-2knz5" Oct 01 10:34:21 crc kubenswrapper[4735]: I1001 10:34:21.397144 4735 generic.go:334] "Generic (PLEG): container finished" podID="de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4" containerID="4c27190f8b3f02d64f090187bca72a8c38246d3cfc9366619beef07e151a043b" exitCode=0 Oct 01 10:34:21 crc kubenswrapper[4735]: I1001 10:34:21.397184 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4","Type":"ContainerDied","Data":"4c27190f8b3f02d64f090187bca72a8c38246d3cfc9366619beef07e151a043b"} Oct 01 10:34:21 crc kubenswrapper[4735]: I1001 10:34:21.458433 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwhgh\" (UniqueName: \"kubernetes.io/projected/01d783e4-455a-4e50-a5a6-ca1a289b2ad7-kube-api-access-cwhgh\") pod \"nova-api-3a5f-account-create-2knz5\" (UID: \"01d783e4-455a-4e50-a5a6-ca1a289b2ad7\") " pod="openstack/nova-api-3a5f-account-create-2knz5" Oct 01 10:34:21 crc kubenswrapper[4735]: I1001 10:34:21.468533 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-ea66-account-create-hsfw4"] Oct 
01 10:34:21 crc kubenswrapper[4735]: I1001 10:34:21.470061 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ea66-account-create-hsfw4" Oct 01 10:34:21 crc kubenswrapper[4735]: I1001 10:34:21.473125 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 01 10:34:21 crc kubenswrapper[4735]: I1001 10:34:21.477202 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ea66-account-create-hsfw4"] Oct 01 10:34:21 crc kubenswrapper[4735]: I1001 10:34:21.478229 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwhgh\" (UniqueName: \"kubernetes.io/projected/01d783e4-455a-4e50-a5a6-ca1a289b2ad7-kube-api-access-cwhgh\") pod \"nova-api-3a5f-account-create-2knz5\" (UID: \"01d783e4-455a-4e50-a5a6-ca1a289b2ad7\") " pod="openstack/nova-api-3a5f-account-create-2knz5" Oct 01 10:34:21 crc kubenswrapper[4735]: I1001 10:34:21.560025 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmlgm\" (UniqueName: \"kubernetes.io/projected/5f1489d8-cd50-4c5f-b6f8-22854e49b246-kube-api-access-tmlgm\") pod \"nova-cell0-ea66-account-create-hsfw4\" (UID: \"5f1489d8-cd50-4c5f-b6f8-22854e49b246\") " pod="openstack/nova-cell0-ea66-account-create-hsfw4" Oct 01 10:34:21 crc kubenswrapper[4735]: I1001 10:34:21.646945 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-3a5f-account-create-2knz5" Oct 01 10:34:21 crc kubenswrapper[4735]: I1001 10:34:21.661429 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmlgm\" (UniqueName: \"kubernetes.io/projected/5f1489d8-cd50-4c5f-b6f8-22854e49b246-kube-api-access-tmlgm\") pod \"nova-cell0-ea66-account-create-hsfw4\" (UID: \"5f1489d8-cd50-4c5f-b6f8-22854e49b246\") " pod="openstack/nova-cell0-ea66-account-create-hsfw4" Oct 01 10:34:21 crc kubenswrapper[4735]: I1001 10:34:21.665142 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cdcc-account-create-5vjp4"] Oct 01 10:34:21 crc kubenswrapper[4735]: I1001 10:34:21.666173 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cdcc-account-create-5vjp4" Oct 01 10:34:21 crc kubenswrapper[4735]: I1001 10:34:21.668326 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 01 10:34:21 crc kubenswrapper[4735]: I1001 10:34:21.678950 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmlgm\" (UniqueName: \"kubernetes.io/projected/5f1489d8-cd50-4c5f-b6f8-22854e49b246-kube-api-access-tmlgm\") pod \"nova-cell0-ea66-account-create-hsfw4\" (UID: \"5f1489d8-cd50-4c5f-b6f8-22854e49b246\") " pod="openstack/nova-cell0-ea66-account-create-hsfw4" Oct 01 10:34:21 crc kubenswrapper[4735]: I1001 10:34:21.690286 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cdcc-account-create-5vjp4"] Oct 01 10:34:21 crc kubenswrapper[4735]: I1001 10:34:21.763706 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxtsr\" (UniqueName: \"kubernetes.io/projected/7258162b-c9ac-49b4-a0ef-ec5f19491efa-kube-api-access-dxtsr\") pod \"nova-cell1-cdcc-account-create-5vjp4\" (UID: \"7258162b-c9ac-49b4-a0ef-ec5f19491efa\") " 
pod="openstack/nova-cell1-cdcc-account-create-5vjp4" Oct 01 10:34:21 crc kubenswrapper[4735]: I1001 10:34:21.821176 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ea66-account-create-hsfw4" Oct 01 10:34:21 crc kubenswrapper[4735]: I1001 10:34:21.865901 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxtsr\" (UniqueName: \"kubernetes.io/projected/7258162b-c9ac-49b4-a0ef-ec5f19491efa-kube-api-access-dxtsr\") pod \"nova-cell1-cdcc-account-create-5vjp4\" (UID: \"7258162b-c9ac-49b4-a0ef-ec5f19491efa\") " pod="openstack/nova-cell1-cdcc-account-create-5vjp4" Oct 01 10:34:21 crc kubenswrapper[4735]: I1001 10:34:21.902375 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxtsr\" (UniqueName: \"kubernetes.io/projected/7258162b-c9ac-49b4-a0ef-ec5f19491efa-kube-api-access-dxtsr\") pod \"nova-cell1-cdcc-account-create-5vjp4\" (UID: \"7258162b-c9ac-49b4-a0ef-ec5f19491efa\") " pod="openstack/nova-cell1-cdcc-account-create-5vjp4" Oct 01 10:34:22 crc kubenswrapper[4735]: I1001 10:34:22.006283 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-696cd688cf-kqrbf" Oct 01 10:34:22 crc kubenswrapper[4735]: I1001 10:34:22.008693 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-696cd688cf-kqrbf" Oct 01 10:34:22 crc kubenswrapper[4735]: I1001 10:34:22.025945 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cdcc-account-create-5vjp4" Oct 01 10:34:22 crc kubenswrapper[4735]: I1001 10:34:22.405444 4735 generic.go:334] "Generic (PLEG): container finished" podID="4c4185de-864c-4329-a2b2-bc7164bb33c6" containerID="0d695d3e4c0af08383ebf43a9745bc57abd3371c82ea2546282bd7cbbcce367d" exitCode=0 Oct 01 10:34:22 crc kubenswrapper[4735]: I1001 10:34:22.406239 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4c4185de-864c-4329-a2b2-bc7164bb33c6","Type":"ContainerDied","Data":"0d695d3e4c0af08383ebf43a9745bc57abd3371c82ea2546282bd7cbbcce367d"} Oct 01 10:34:22 crc kubenswrapper[4735]: I1001 10:34:22.544906 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7bcff764fb-c7nmm" podUID="45ed423f-4895-4df3-9a04-2b916f38f57d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Oct 01 10:34:23 crc kubenswrapper[4735]: E1001 10:34:23.308185 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24:latest" Oct 01 10:34:23 crc kubenswrapper[4735]: E1001 10:34:23.308791 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24:latest,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bbxhx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(8506c575-4d5d-4691-9ebd-75d8c878f9a9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 10:34:23 crc kubenswrapper[4735]: E1001 10:34:23.309948 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="8506c575-4d5d-4691-9ebd-75d8c878f9a9" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.427210 4735 generic.go:334] "Generic (PLEG): container finished" podID="5f2b4d4c-d741-4a88-b71c-fda5946d3896" containerID="4c2bb6ac267ef9269e11d07112b91da398af5bddd4295e35e8730117353b16f4" exitCode=0 Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.427336 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-94689896b-kwvtc" event={"ID":"5f2b4d4c-d741-4a88-b71c-fda5946d3896","Type":"ContainerDied","Data":"4c2bb6ac267ef9269e11d07112b91da398af5bddd4295e35e8730117353b16f4"} Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 
10:34:23.433087 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8cvdn" event={"ID":"541784dc-4146-459d-bee0-2f97d22a7977","Type":"ContainerDied","Data":"8106cc853f9d1653f43b0d0e356c1e0c215d1019ae59972d6aba6a1c686af251"} Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.433123 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8106cc853f9d1653f43b0d0e356c1e0c215d1019ae59972d6aba6a1c686af251" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.435118 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-8cvdn" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.443309 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bcff764fb-c7nmm" event={"ID":"45ed423f-4895-4df3-9a04-2b916f38f57d","Type":"ContainerDied","Data":"1722000fc5c11369090132cb41a4718d023583ec4f6678d0c5f1663aa4f1afc4"} Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.443344 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1722000fc5c11369090132cb41a4718d023583ec4f6678d0c5f1663aa4f1afc4" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.443390 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8506c575-4d5d-4691-9ebd-75d8c878f9a9" containerName="ceilometer-central-agent" containerID="cri-o://f2fd5fba0b761cf62bfc989e53081dd9793c5448aae946a24d57446b23d5f2fb" gracePeriod=30 Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.443427 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8506c575-4d5d-4691-9ebd-75d8c878f9a9" containerName="ceilometer-notification-agent" containerID="cri-o://f16d05092237210f009ea9104512444793fc9c41bc32a45a487845d6d6f77605" gracePeriod=30 Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.443401 4735 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8506c575-4d5d-4691-9ebd-75d8c878f9a9" containerName="sg-core" containerID="cri-o://77bae45e0a3516c6625bd67820d03f8c1de32b8759457e0fb94b7e4e62b09187" gracePeriod=30 Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.448785 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7bcff764fb-c7nmm" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.492731 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45ed423f-4895-4df3-9a04-2b916f38f57d-scripts\") pod \"45ed423f-4895-4df3-9a04-2b916f38f57d\" (UID: \"45ed423f-4895-4df3-9a04-2b916f38f57d\") " Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.492786 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/541784dc-4146-459d-bee0-2f97d22a7977-db-sync-config-data\") pod \"541784dc-4146-459d-bee0-2f97d22a7977\" (UID: \"541784dc-4146-459d-bee0-2f97d22a7977\") " Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.492837 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/45ed423f-4895-4df3-9a04-2b916f38f57d-horizon-tls-certs\") pod \"45ed423f-4895-4df3-9a04-2b916f38f57d\" (UID: \"45ed423f-4895-4df3-9a04-2b916f38f57d\") " Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.492879 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45ed423f-4895-4df3-9a04-2b916f38f57d-logs\") pod \"45ed423f-4895-4df3-9a04-2b916f38f57d\" (UID: \"45ed423f-4895-4df3-9a04-2b916f38f57d\") " Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.492912 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541784dc-4146-459d-bee0-2f97d22a7977-combined-ca-bundle\") pod \"541784dc-4146-459d-bee0-2f97d22a7977\" (UID: \"541784dc-4146-459d-bee0-2f97d22a7977\") " Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.492977 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45ed423f-4895-4df3-9a04-2b916f38f57d-combined-ca-bundle\") pod \"45ed423f-4895-4df3-9a04-2b916f38f57d\" (UID: \"45ed423f-4895-4df3-9a04-2b916f38f57d\") " Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.493048 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/45ed423f-4895-4df3-9a04-2b916f38f57d-config-data\") pod \"45ed423f-4895-4df3-9a04-2b916f38f57d\" (UID: \"45ed423f-4895-4df3-9a04-2b916f38f57d\") " Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.493102 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcl52\" (UniqueName: \"kubernetes.io/projected/45ed423f-4895-4df3-9a04-2b916f38f57d-kube-api-access-fcl52\") pod \"45ed423f-4895-4df3-9a04-2b916f38f57d\" (UID: \"45ed423f-4895-4df3-9a04-2b916f38f57d\") " Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.493118 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/45ed423f-4895-4df3-9a04-2b916f38f57d-horizon-secret-key\") pod \"45ed423f-4895-4df3-9a04-2b916f38f57d\" (UID: \"45ed423f-4895-4df3-9a04-2b916f38f57d\") " Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.493163 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rg2vn\" (UniqueName: \"kubernetes.io/projected/541784dc-4146-459d-bee0-2f97d22a7977-kube-api-access-rg2vn\") pod \"541784dc-4146-459d-bee0-2f97d22a7977\" (UID: 
\"541784dc-4146-459d-bee0-2f97d22a7977\") " Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.496477 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45ed423f-4895-4df3-9a04-2b916f38f57d-logs" (OuterVolumeSpecName: "logs") pod "45ed423f-4895-4df3-9a04-2b916f38f57d" (UID: "45ed423f-4895-4df3-9a04-2b916f38f57d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.507472 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/541784dc-4146-459d-bee0-2f97d22a7977-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "541784dc-4146-459d-bee0-2f97d22a7977" (UID: "541784dc-4146-459d-bee0-2f97d22a7977"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.508799 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45ed423f-4895-4df3-9a04-2b916f38f57d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "45ed423f-4895-4df3-9a04-2b916f38f57d" (UID: "45ed423f-4895-4df3-9a04-2b916f38f57d"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.510764 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/541784dc-4146-459d-bee0-2f97d22a7977-kube-api-access-rg2vn" (OuterVolumeSpecName: "kube-api-access-rg2vn") pod "541784dc-4146-459d-bee0-2f97d22a7977" (UID: "541784dc-4146-459d-bee0-2f97d22a7977"). InnerVolumeSpecName "kube-api-access-rg2vn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.516165 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45ed423f-4895-4df3-9a04-2b916f38f57d-kube-api-access-fcl52" (OuterVolumeSpecName: "kube-api-access-fcl52") pod "45ed423f-4895-4df3-9a04-2b916f38f57d" (UID: "45ed423f-4895-4df3-9a04-2b916f38f57d"). InnerVolumeSpecName "kube-api-access-fcl52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.533298 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45ed423f-4895-4df3-9a04-2b916f38f57d-scripts" (OuterVolumeSpecName: "scripts") pod "45ed423f-4895-4df3-9a04-2b916f38f57d" (UID: "45ed423f-4895-4df3-9a04-2b916f38f57d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.536356 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45ed423f-4895-4df3-9a04-2b916f38f57d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45ed423f-4895-4df3-9a04-2b916f38f57d" (UID: "45ed423f-4895-4df3-9a04-2b916f38f57d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.540469 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/541784dc-4146-459d-bee0-2f97d22a7977-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "541784dc-4146-459d-bee0-2f97d22a7977" (UID: "541784dc-4146-459d-bee0-2f97d22a7977"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.554784 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45ed423f-4895-4df3-9a04-2b916f38f57d-config-data" (OuterVolumeSpecName: "config-data") pod "45ed423f-4895-4df3-9a04-2b916f38f57d" (UID: "45ed423f-4895-4df3-9a04-2b916f38f57d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.578381 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45ed423f-4895-4df3-9a04-2b916f38f57d-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "45ed423f-4895-4df3-9a04-2b916f38f57d" (UID: "45ed423f-4895-4df3-9a04-2b916f38f57d"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.595560 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rg2vn\" (UniqueName: \"kubernetes.io/projected/541784dc-4146-459d-bee0-2f97d22a7977-kube-api-access-rg2vn\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.595607 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45ed423f-4895-4df3-9a04-2b916f38f57d-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.595618 4735 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/541784dc-4146-459d-bee0-2f97d22a7977-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.595627 4735 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/45ed423f-4895-4df3-9a04-2b916f38f57d-horizon-tls-certs\") on node \"crc\" 
DevicePath \"\"" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.595635 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45ed423f-4895-4df3-9a04-2b916f38f57d-logs\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.595643 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541784dc-4146-459d-bee0-2f97d22a7977-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.595651 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45ed423f-4895-4df3-9a04-2b916f38f57d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.595659 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/45ed423f-4895-4df3-9a04-2b916f38f57d-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.595667 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcl52\" (UniqueName: \"kubernetes.io/projected/45ed423f-4895-4df3-9a04-2b916f38f57d-kube-api-access-fcl52\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.595676 4735 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/45ed423f-4895-4df3-9a04-2b916f38f57d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.697817 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.772235 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.801893 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-logs\") pod \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\" (UID: \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\") " Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.801941 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c4185de-864c-4329-a2b2-bc7164bb33c6-scripts\") pod \"4c4185de-864c-4329-a2b2-bc7164bb33c6\" (UID: \"4c4185de-864c-4329-a2b2-bc7164bb33c6\") " Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.802052 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\" (UID: \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\") " Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.802111 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-internal-tls-certs\") pod \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\" (UID: \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\") " Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.802138 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-scripts\") pod \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\" (UID: \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\") " Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.802295 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-config-data\") pod \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\" (UID: \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\") " Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.802322 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c4185de-864c-4329-a2b2-bc7164bb33c6-logs\") pod \"4c4185de-864c-4329-a2b2-bc7164bb33c6\" (UID: \"4c4185de-864c-4329-a2b2-bc7164bb33c6\") " Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.802336 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"4c4185de-864c-4329-a2b2-bc7164bb33c6\" (UID: \"4c4185de-864c-4329-a2b2-bc7164bb33c6\") " Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.802368 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-combined-ca-bundle\") pod \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\" (UID: \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\") " Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.802384 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c4185de-864c-4329-a2b2-bc7164bb33c6-config-data\") pod \"4c4185de-864c-4329-a2b2-bc7164bb33c6\" (UID: \"4c4185de-864c-4329-a2b2-bc7164bb33c6\") " Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.802400 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-httpd-run\") pod \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\" (UID: \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\") " Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.802424 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c4185de-864c-4329-a2b2-bc7164bb33c6-public-tls-certs\") pod \"4c4185de-864c-4329-a2b2-bc7164bb33c6\" (UID: \"4c4185de-864c-4329-a2b2-bc7164bb33c6\") " Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.802449 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fx2z7\" (UniqueName: \"kubernetes.io/projected/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-kube-api-access-fx2z7\") pod \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\" (UID: \"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4\") " Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.802469 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c4185de-864c-4329-a2b2-bc7164bb33c6-httpd-run\") pod \"4c4185de-864c-4329-a2b2-bc7164bb33c6\" (UID: \"4c4185de-864c-4329-a2b2-bc7164bb33c6\") " Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.802503 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c4185de-864c-4329-a2b2-bc7164bb33c6-combined-ca-bundle\") pod \"4c4185de-864c-4329-a2b2-bc7164bb33c6\" (UID: \"4c4185de-864c-4329-a2b2-bc7164bb33c6\") " Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.802525 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6qpd\" (UniqueName: \"kubernetes.io/projected/4c4185de-864c-4329-a2b2-bc7164bb33c6-kube-api-access-x6qpd\") pod \"4c4185de-864c-4329-a2b2-bc7164bb33c6\" (UID: \"4c4185de-864c-4329-a2b2-bc7164bb33c6\") " Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.807854 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "4c4185de-864c-4329-a2b2-bc7164bb33c6" (UID: "4c4185de-864c-4329-a2b2-bc7164bb33c6"). 
InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.808198 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-logs" (OuterVolumeSpecName: "logs") pod "de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4" (UID: "de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.808526 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c4185de-864c-4329-a2b2-bc7164bb33c6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4c4185de-864c-4329-a2b2-bc7164bb33c6" (UID: "4c4185de-864c-4329-a2b2-bc7164bb33c6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.808737 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4" (UID: "de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.808937 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c4185de-864c-4329-a2b2-bc7164bb33c6-logs" (OuterVolumeSpecName: "logs") pod "4c4185de-864c-4329-a2b2-bc7164bb33c6" (UID: "4c4185de-864c-4329-a2b2-bc7164bb33c6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.817982 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c4185de-864c-4329-a2b2-bc7164bb33c6-scripts" (OuterVolumeSpecName: "scripts") pod "4c4185de-864c-4329-a2b2-bc7164bb33c6" (UID: "4c4185de-864c-4329-a2b2-bc7164bb33c6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.818023 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c4185de-864c-4329-a2b2-bc7164bb33c6-kube-api-access-x6qpd" (OuterVolumeSpecName: "kube-api-access-x6qpd") pod "4c4185de-864c-4329-a2b2-bc7164bb33c6" (UID: "4c4185de-864c-4329-a2b2-bc7164bb33c6"). InnerVolumeSpecName "kube-api-access-x6qpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.818055 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-scripts" (OuterVolumeSpecName: "scripts") pod "de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4" (UID: "de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.819117 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-kube-api-access-fx2z7" (OuterVolumeSpecName: "kube-api-access-fx2z7") pod "de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4" (UID: "de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4"). InnerVolumeSpecName "kube-api-access-fx2z7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.819796 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4" (UID: "de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.907389 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.907653 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c4185de-864c-4329-a2b2-bc7164bb33c6-logs\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.907676 4735 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.907686 4735 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.907695 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fx2z7\" (UniqueName: \"kubernetes.io/projected/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-kube-api-access-fx2z7\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.907706 4735 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c4185de-864c-4329-a2b2-bc7164bb33c6-httpd-run\") on node 
\"crc\" DevicePath \"\"" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.907720 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6qpd\" (UniqueName: \"kubernetes.io/projected/4c4185de-864c-4329-a2b2-bc7164bb33c6-kube-api-access-x6qpd\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.907731 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-logs\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.907740 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c4185de-864c-4329-a2b2-bc7164bb33c6-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.907754 4735 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.916003 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-config-data" (OuterVolumeSpecName: "config-data") pod "de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4" (UID: "de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.922855 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4" (UID: "de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.932309 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4" (UID: "de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.933243 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c4185de-864c-4329-a2b2-bc7164bb33c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c4185de-864c-4329-a2b2-bc7164bb33c6" (UID: "4c4185de-864c-4329-a2b2-bc7164bb33c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:34:23 crc kubenswrapper[4735]: I1001 10:34:23.936244 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c4185de-864c-4329-a2b2-bc7164bb33c6-config-data" (OuterVolumeSpecName: "config-data") pod "4c4185de-864c-4329-a2b2-bc7164bb33c6" (UID: "4c4185de-864c-4329-a2b2-bc7164bb33c6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.013114 4735 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.017735 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c4185de-864c-4329-a2b2-bc7164bb33c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.017769 4735 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.017782 4735 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.017793 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.017804 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.017816 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c4185de-864c-4329-a2b2-bc7164bb33c6-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.023987 4735 operation_generator.go:917] UnmountDevice succeeded for volume 
"local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.048920 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c4185de-864c-4329-a2b2-bc7164bb33c6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4c4185de-864c-4329-a2b2-bc7164bb33c6" (UID: "4c4185de-864c-4329-a2b2-bc7164bb33c6"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.055518 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ea66-account-create-hsfw4"] Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.063103 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cdcc-account-create-5vjp4"] Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.067280 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 01 10:34:24 crc kubenswrapper[4735]: W1001 10:34:24.067308 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01d783e4_455a_4e50_a5a6_ca1a289b2ad7.slice/crio-f5ee0f9d59463794dd4c83a26f5f37b9ed2b15d69d3291cc6189268f11fe07be WatchSource:0}: Error finding container f5ee0f9d59463794dd4c83a26f5f37b9ed2b15d69d3291cc6189268f11fe07be: Status 404 returned error can't find the container with id f5ee0f9d59463794dd4c83a26f5f37b9ed2b15d69d3291cc6189268f11fe07be Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.069554 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.070651 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-3a5f-account-create-2knz5"] Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.075293 4735 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.119314 4735 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.119357 4735 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c4185de-864c-4329-a2b2-bc7164bb33c6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.454896 4735 generic.go:334] "Generic (PLEG): container finished" podID="7258162b-c9ac-49b4-a0ef-ec5f19491efa" containerID="e21226580f7cea2f4269330f076615a590679fab57d62f8c9b421f201fbd3e93" exitCode=0 Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.454993 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cdcc-account-create-5vjp4" event={"ID":"7258162b-c9ac-49b4-a0ef-ec5f19491efa","Type":"ContainerDied","Data":"e21226580f7cea2f4269330f076615a590679fab57d62f8c9b421f201fbd3e93"} Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.455033 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cdcc-account-create-5vjp4" event={"ID":"7258162b-c9ac-49b4-a0ef-ec5f19491efa","Type":"ContainerStarted","Data":"464e6c5a559f6d8acc5cc32967a108929e7d0a282bb6bdf134f5974124fbb44e"} Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.457717 4735 generic.go:334] "Generic (PLEG): container finished" podID="01d783e4-455a-4e50-a5a6-ca1a289b2ad7" containerID="96e7915ab4bdb82f58a4c39d45fd16e211a2836ed3782e7709840e491a5e735c" exitCode=0 Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.457774 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3a5f-account-create-2knz5" 
event={"ID":"01d783e4-455a-4e50-a5a6-ca1a289b2ad7","Type":"ContainerDied","Data":"96e7915ab4bdb82f58a4c39d45fd16e211a2836ed3782e7709840e491a5e735c"} Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.457796 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3a5f-account-create-2knz5" event={"ID":"01d783e4-455a-4e50-a5a6-ca1a289b2ad7","Type":"ContainerStarted","Data":"f5ee0f9d59463794dd4c83a26f5f37b9ed2b15d69d3291cc6189268f11fe07be"} Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.460124 4735 generic.go:334] "Generic (PLEG): container finished" podID="8506c575-4d5d-4691-9ebd-75d8c878f9a9" containerID="77bae45e0a3516c6625bd67820d03f8c1de32b8759457e0fb94b7e4e62b09187" exitCode=2 Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.460160 4735 generic.go:334] "Generic (PLEG): container finished" podID="8506c575-4d5d-4691-9ebd-75d8c878f9a9" containerID="f2fd5fba0b761cf62bfc989e53081dd9793c5448aae946a24d57446b23d5f2fb" exitCode=0 Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.460233 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8506c575-4d5d-4691-9ebd-75d8c878f9a9","Type":"ContainerDied","Data":"77bae45e0a3516c6625bd67820d03f8c1de32b8759457e0fb94b7e4e62b09187"} Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.460267 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8506c575-4d5d-4691-9ebd-75d8c878f9a9","Type":"ContainerDied","Data":"f2fd5fba0b761cf62bfc989e53081dd9793c5448aae946a24d57446b23d5f2fb"} Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.462242 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c4b138e5-c197-4f7d-b0c3-aed5f8dd07e5","Type":"ContainerStarted","Data":"c513358e8fad7bbb769bfffe0d529b60d80c88d3c5adffc40c3b08bf2fa5cbaf"} Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.464504 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"4c4185de-864c-4329-a2b2-bc7164bb33c6","Type":"ContainerDied","Data":"200e953a57027f5f67060a357e5c0f987f708b621186cb468ef16b72ae86efd1"} Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.464678 4735 scope.go:117] "RemoveContainer" containerID="0d695d3e4c0af08383ebf43a9745bc57abd3371c82ea2546282bd7cbbcce367d" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.464548 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.470941 4735 generic.go:334] "Generic (PLEG): container finished" podID="5f1489d8-cd50-4c5f-b6f8-22854e49b246" containerID="ab4a2a9e23a314bf28665e4af012b73eccb4d07fb6eda7f8f4e44048c19f2b90" exitCode=0 Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.471005 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ea66-account-create-hsfw4" event={"ID":"5f1489d8-cd50-4c5f-b6f8-22854e49b246","Type":"ContainerDied","Data":"ab4a2a9e23a314bf28665e4af012b73eccb4d07fb6eda7f8f4e44048c19f2b90"} Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.471232 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ea66-account-create-hsfw4" event={"ID":"5f1489d8-cd50-4c5f-b6f8-22854e49b246","Type":"ContainerStarted","Data":"799e83d6e5f2bf3f0a6074621a97c4de6f3e58994c3112d778a802f3b7fadb20"} Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.474319 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-8cvdn" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.474580 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.475266 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4","Type":"ContainerDied","Data":"a2bcce01642bc2a310aabc8a548b7fabe099230a2ef08a892328d57e415f1b83"} Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.475549 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7bcff764fb-c7nmm" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.501314 4735 scope.go:117] "RemoveContainer" containerID="0f7a8b38e554a2525bda66d4fb06f3bedaae8ede449744a03ede59dbc2b3eabe" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.504463 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=12.787685565 podStartE2EDuration="23.504443522s" podCreationTimestamp="2025-10-01 10:34:01 +0000 UTC" firstStartedPulling="2025-10-01 10:34:12.777570557 +0000 UTC m=+1011.470391819" lastFinishedPulling="2025-10-01 10:34:23.494328504 +0000 UTC m=+1022.187149776" observedRunningTime="2025-10-01 10:34:24.498989697 +0000 UTC m=+1023.191810959" watchObservedRunningTime="2025-10-01 10:34:24.504443522 +0000 UTC m=+1023.197264784" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.523331 4735 scope.go:117] "RemoveContainer" containerID="4c27190f8b3f02d64f090187bca72a8c38246d3cfc9366619beef07e151a043b" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.547867 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.558540 4735 scope.go:117] "RemoveContainer" containerID="29221342824c12498a2445acd6150e30aae7f82799dfded43e139a3d552e1f47" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.569563 4735 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.585077 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7bcff764fb-c7nmm"] Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.607485 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7bcff764fb-c7nmm"] Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.624256 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 10:34:24 crc kubenswrapper[4735]: E1001 10:34:24.624607 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4" containerName="glance-httpd" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.624619 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4" containerName="glance-httpd" Oct 01 10:34:24 crc kubenswrapper[4735]: E1001 10:34:24.624629 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c4185de-864c-4329-a2b2-bc7164bb33c6" containerName="glance-log" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.624635 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c4185de-864c-4329-a2b2-bc7164bb33c6" containerName="glance-log" Oct 01 10:34:24 crc kubenswrapper[4735]: E1001 10:34:24.624661 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45ed423f-4895-4df3-9a04-2b916f38f57d" containerName="horizon-log" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.624667 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="45ed423f-4895-4df3-9a04-2b916f38f57d" containerName="horizon-log" Oct 01 10:34:24 crc kubenswrapper[4735]: E1001 10:34:24.624682 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45ed423f-4895-4df3-9a04-2b916f38f57d" containerName="horizon" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.624688 4735 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="45ed423f-4895-4df3-9a04-2b916f38f57d" containerName="horizon" Oct 01 10:34:24 crc kubenswrapper[4735]: E1001 10:34:24.624697 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c4185de-864c-4329-a2b2-bc7164bb33c6" containerName="glance-httpd" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.624702 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c4185de-864c-4329-a2b2-bc7164bb33c6" containerName="glance-httpd" Oct 01 10:34:24 crc kubenswrapper[4735]: E1001 10:34:24.624714 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="541784dc-4146-459d-bee0-2f97d22a7977" containerName="barbican-db-sync" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.624719 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="541784dc-4146-459d-bee0-2f97d22a7977" containerName="barbican-db-sync" Oct 01 10:34:24 crc kubenswrapper[4735]: E1001 10:34:24.624729 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4" containerName="glance-log" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.624735 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4" containerName="glance-log" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.624898 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="45ed423f-4895-4df3-9a04-2b916f38f57d" containerName="horizon-log" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.624909 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4" containerName="glance-log" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.624917 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="541784dc-4146-459d-bee0-2f97d22a7977" containerName="barbican-db-sync" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.624925 4735 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="45ed423f-4895-4df3-9a04-2b916f38f57d" containerName="horizon" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.624940 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4" containerName="glance-httpd" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.624947 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c4185de-864c-4329-a2b2-bc7164bb33c6" containerName="glance-httpd" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.624960 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c4185de-864c-4329-a2b2-bc7164bb33c6" containerName="glance-log" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.625811 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.628888 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.628899 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.629136 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-8zx2n" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.629692 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.642479 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.670931 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.681186 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.689603 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.691078 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.693907 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.694256 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.700384 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.724233 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-55fd7c674f-kfd9n"] Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.726079 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-55fd7c674f-kfd9n" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.728046 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39f45bdc-e25e-41f8-aefe-0dede18c4bb8-config-data\") pod \"barbican-worker-55fd7c674f-kfd9n\" (UID: \"39f45bdc-e25e-41f8-aefe-0dede18c4bb8\") " pod="openstack/barbican-worker-55fd7c674f-kfd9n" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.728074 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkk5g\" (UniqueName: \"kubernetes.io/projected/39f45bdc-e25e-41f8-aefe-0dede18c4bb8-kube-api-access-vkk5g\") pod \"barbican-worker-55fd7c674f-kfd9n\" (UID: \"39f45bdc-e25e-41f8-aefe-0dede18c4bb8\") " pod="openstack/barbican-worker-55fd7c674f-kfd9n" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.728102 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39f45bdc-e25e-41f8-aefe-0dede18c4bb8-config-data-custom\") pod \"barbican-worker-55fd7c674f-kfd9n\" (UID: \"39f45bdc-e25e-41f8-aefe-0dede18c4bb8\") " pod="openstack/barbican-worker-55fd7c674f-kfd9n" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.728129 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39f45bdc-e25e-41f8-aefe-0dede18c4bb8-combined-ca-bundle\") pod \"barbican-worker-55fd7c674f-kfd9n\" (UID: \"39f45bdc-e25e-41f8-aefe-0dede18c4bb8\") " pod="openstack/barbican-worker-55fd7c674f-kfd9n" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.728160 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/39f45bdc-e25e-41f8-aefe-0dede18c4bb8-logs\") pod \"barbican-worker-55fd7c674f-kfd9n\" (UID: \"39f45bdc-e25e-41f8-aefe-0dede18c4bb8\") " pod="openstack/barbican-worker-55fd7c674f-kfd9n" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.729800 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.737198 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-dcf4d" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.737380 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.756895 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-599c96b4d8-46mbs"] Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.758267 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-599c96b4d8-46mbs" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.769626 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.798560 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-55fd7c674f-kfd9n"] Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.815117 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-599c96b4d8-46mbs"] Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.829483 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/53b4ae38-a993-4a83-93bf-da796d4be856-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"53b4ae38-a993-4a83-93bf-da796d4be856\") " pod="openstack/glance-default-external-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.829535 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53b4ae38-a993-4a83-93bf-da796d4be856-logs\") pod \"glance-default-external-api-0\" (UID: \"53b4ae38-a993-4a83-93bf-da796d4be856\") " pod="openstack/glance-default-external-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.829565 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53b4ae38-a993-4a83-93bf-da796d4be856-scripts\") pod \"glance-default-external-api-0\" (UID: \"53b4ae38-a993-4a83-93bf-da796d4be856\") " pod="openstack/glance-default-external-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.829584 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/53b4ae38-a993-4a83-93bf-da796d4be856-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"53b4ae38-a993-4a83-93bf-da796d4be856\") " pod="openstack/glance-default-external-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.829614 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/971696de-af05-4d40-89d1-64a2688b08e0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"971696de-af05-4d40-89d1-64a2688b08e0\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.829632 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"53b4ae38-a993-4a83-93bf-da796d4be856\") " pod="openstack/glance-default-external-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.829663 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/971696de-af05-4d40-89d1-64a2688b08e0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"971696de-af05-4d40-89d1-64a2688b08e0\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.829680 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/971696de-af05-4d40-89d1-64a2688b08e0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"971696de-af05-4d40-89d1-64a2688b08e0\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.829704 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/53b4ae38-a993-4a83-93bf-da796d4be856-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"53b4ae38-a993-4a83-93bf-da796d4be856\") " pod="openstack/glance-default-external-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.829735 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53b4ae38-a993-4a83-93bf-da796d4be856-config-data\") pod \"glance-default-external-api-0\" (UID: \"53b4ae38-a993-4a83-93bf-da796d4be856\") " pod="openstack/glance-default-external-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.829751 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"971696de-af05-4d40-89d1-64a2688b08e0\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.829768 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/971696de-af05-4d40-89d1-64a2688b08e0-logs\") pod \"glance-default-internal-api-0\" (UID: \"971696de-af05-4d40-89d1-64a2688b08e0\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.829792 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39f45bdc-e25e-41f8-aefe-0dede18c4bb8-config-data\") pod \"barbican-worker-55fd7c674f-kfd9n\" (UID: \"39f45bdc-e25e-41f8-aefe-0dede18c4bb8\") " pod="openstack/barbican-worker-55fd7c674f-kfd9n" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.829812 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkk5g\" (UniqueName: 
\"kubernetes.io/projected/39f45bdc-e25e-41f8-aefe-0dede18c4bb8-kube-api-access-vkk5g\") pod \"barbican-worker-55fd7c674f-kfd9n\" (UID: \"39f45bdc-e25e-41f8-aefe-0dede18c4bb8\") " pod="openstack/barbican-worker-55fd7c674f-kfd9n" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.829827 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/971696de-af05-4d40-89d1-64a2688b08e0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"971696de-af05-4d40-89d1-64a2688b08e0\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.829845 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkktp\" (UniqueName: \"kubernetes.io/projected/971696de-af05-4d40-89d1-64a2688b08e0-kube-api-access-qkktp\") pod \"glance-default-internal-api-0\" (UID: \"971696de-af05-4d40-89d1-64a2688b08e0\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.829863 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971696de-af05-4d40-89d1-64a2688b08e0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"971696de-af05-4d40-89d1-64a2688b08e0\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.829881 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39f45bdc-e25e-41f8-aefe-0dede18c4bb8-config-data-custom\") pod \"barbican-worker-55fd7c674f-kfd9n\" (UID: \"39f45bdc-e25e-41f8-aefe-0dede18c4bb8\") " pod="openstack/barbican-worker-55fd7c674f-kfd9n" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.829911 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39f45bdc-e25e-41f8-aefe-0dede18c4bb8-combined-ca-bundle\") pod \"barbican-worker-55fd7c674f-kfd9n\" (UID: \"39f45bdc-e25e-41f8-aefe-0dede18c4bb8\") " pod="openstack/barbican-worker-55fd7c674f-kfd9n" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.829927 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwlgx\" (UniqueName: \"kubernetes.io/projected/53b4ae38-a993-4a83-93bf-da796d4be856-kube-api-access-pwlgx\") pod \"glance-default-external-api-0\" (UID: \"53b4ae38-a993-4a83-93bf-da796d4be856\") " pod="openstack/glance-default-external-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.829992 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39f45bdc-e25e-41f8-aefe-0dede18c4bb8-logs\") pod \"barbican-worker-55fd7c674f-kfd9n\" (UID: \"39f45bdc-e25e-41f8-aefe-0dede18c4bb8\") " pod="openstack/barbican-worker-55fd7c674f-kfd9n" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.830419 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39f45bdc-e25e-41f8-aefe-0dede18c4bb8-logs\") pod \"barbican-worker-55fd7c674f-kfd9n\" (UID: \"39f45bdc-e25e-41f8-aefe-0dede18c4bb8\") " pod="openstack/barbican-worker-55fd7c674f-kfd9n" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.847004 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39f45bdc-e25e-41f8-aefe-0dede18c4bb8-combined-ca-bundle\") pod \"barbican-worker-55fd7c674f-kfd9n\" (UID: \"39f45bdc-e25e-41f8-aefe-0dede18c4bb8\") " pod="openstack/barbican-worker-55fd7c674f-kfd9n" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.847081 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/39f45bdc-e25e-41f8-aefe-0dede18c4bb8-config-data-custom\") pod \"barbican-worker-55fd7c674f-kfd9n\" (UID: \"39f45bdc-e25e-41f8-aefe-0dede18c4bb8\") " pod="openstack/barbican-worker-55fd7c674f-kfd9n" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.864598 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39f45bdc-e25e-41f8-aefe-0dede18c4bb8-config-data\") pod \"barbican-worker-55fd7c674f-kfd9n\" (UID: \"39f45bdc-e25e-41f8-aefe-0dede18c4bb8\") " pod="openstack/barbican-worker-55fd7c674f-kfd9n" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.879094 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-667b6"] Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.904944 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-667b6" Oct 01 10:34:24 crc kubenswrapper[4735]: E1001 10:34:24.912655 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-84vvz" podUID="61dd2f37-7f60-42f5-a3d0-3b693d1e64be" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.919137 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkk5g\" (UniqueName: \"kubernetes.io/projected/39f45bdc-e25e-41f8-aefe-0dede18c4bb8-kube-api-access-vkk5g\") pod \"barbican-worker-55fd7c674f-kfd9n\" (UID: \"39f45bdc-e25e-41f8-aefe-0dede18c4bb8\") " pod="openstack/barbican-worker-55fd7c674f-kfd9n" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.944406 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/971696de-af05-4d40-89d1-64a2688b08e0-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"971696de-af05-4d40-89d1-64a2688b08e0\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.944451 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"53b4ae38-a993-4a83-93bf-da796d4be856\") " pod="openstack/glance-default-external-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.944483 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce28744-bfa2-4674-a495-02abf8245d38-combined-ca-bundle\") pod \"barbican-keystone-listener-599c96b4d8-46mbs\" (UID: \"4ce28744-bfa2-4674-a495-02abf8245d38\") " pod="openstack/barbican-keystone-listener-599c96b4d8-46mbs" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.944529 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/971696de-af05-4d40-89d1-64a2688b08e0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"971696de-af05-4d40-89d1-64a2688b08e0\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.944549 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/971696de-af05-4d40-89d1-64a2688b08e0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"971696de-af05-4d40-89d1-64a2688b08e0\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.944577 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/53b4ae38-a993-4a83-93bf-da796d4be856-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"53b4ae38-a993-4a83-93bf-da796d4be856\") " pod="openstack/glance-default-external-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.944613 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ce28744-bfa2-4674-a495-02abf8245d38-logs\") pod \"barbican-keystone-listener-599c96b4d8-46mbs\" (UID: \"4ce28744-bfa2-4674-a495-02abf8245d38\") " pod="openstack/barbican-keystone-listener-599c96b4d8-46mbs" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.944633 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53b4ae38-a993-4a83-93bf-da796d4be856-config-data\") pod \"glance-default-external-api-0\" (UID: \"53b4ae38-a993-4a83-93bf-da796d4be856\") " pod="openstack/glance-default-external-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.946278 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/53b4ae38-a993-4a83-93bf-da796d4be856-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"53b4ae38-a993-4a83-93bf-da796d4be856\") " pod="openstack/glance-default-external-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.947180 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"53b4ae38-a993-4a83-93bf-da796d4be856\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.957156 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"971696de-af05-4d40-89d1-64a2688b08e0\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.957235 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/971696de-af05-4d40-89d1-64a2688b08e0-logs\") pod \"glance-default-internal-api-0\" (UID: \"971696de-af05-4d40-89d1-64a2688b08e0\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.957327 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/971696de-af05-4d40-89d1-64a2688b08e0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"971696de-af05-4d40-89d1-64a2688b08e0\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.957358 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkktp\" (UniqueName: \"kubernetes.io/projected/971696de-af05-4d40-89d1-64a2688b08e0-kube-api-access-qkktp\") pod \"glance-default-internal-api-0\" (UID: \"971696de-af05-4d40-89d1-64a2688b08e0\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.957397 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971696de-af05-4d40-89d1-64a2688b08e0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"971696de-af05-4d40-89d1-64a2688b08e0\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.957518 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54t2j\" (UniqueName: \"kubernetes.io/projected/4ce28744-bfa2-4674-a495-02abf8245d38-kube-api-access-54t2j\") pod \"barbican-keystone-listener-599c96b4d8-46mbs\" (UID: 
\"4ce28744-bfa2-4674-a495-02abf8245d38\") " pod="openstack/barbican-keystone-listener-599c96b4d8-46mbs" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.957573 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwlgx\" (UniqueName: \"kubernetes.io/projected/53b4ae38-a993-4a83-93bf-da796d4be856-kube-api-access-pwlgx\") pod \"glance-default-external-api-0\" (UID: \"53b4ae38-a993-4a83-93bf-da796d4be856\") " pod="openstack/glance-default-external-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.957645 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4ce28744-bfa2-4674-a495-02abf8245d38-config-data-custom\") pod \"barbican-keystone-listener-599c96b4d8-46mbs\" (UID: \"4ce28744-bfa2-4674-a495-02abf8245d38\") " pod="openstack/barbican-keystone-listener-599c96b4d8-46mbs" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.958184 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/53b4ae38-a993-4a83-93bf-da796d4be856-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"53b4ae38-a993-4a83-93bf-da796d4be856\") " pod="openstack/glance-default-external-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.958217 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53b4ae38-a993-4a83-93bf-da796d4be856-logs\") pod \"glance-default-external-api-0\" (UID: \"53b4ae38-a993-4a83-93bf-da796d4be856\") " pod="openstack/glance-default-external-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.958271 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53b4ae38-a993-4a83-93bf-da796d4be856-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"53b4ae38-a993-4a83-93bf-da796d4be856\") " pod="openstack/glance-default-external-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.958295 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53b4ae38-a993-4a83-93bf-da796d4be856-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"53b4ae38-a993-4a83-93bf-da796d4be856\") " pod="openstack/glance-default-external-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.958355 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ce28744-bfa2-4674-a495-02abf8245d38-config-data\") pod \"barbican-keystone-listener-599c96b4d8-46mbs\" (UID: \"4ce28744-bfa2-4674-a495-02abf8245d38\") " pod="openstack/barbican-keystone-listener-599c96b4d8-46mbs" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.963060 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/971696de-af05-4d40-89d1-64a2688b08e0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"971696de-af05-4d40-89d1-64a2688b08e0\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.963404 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"971696de-af05-4d40-89d1-64a2688b08e0\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.979893 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/971696de-af05-4d40-89d1-64a2688b08e0-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"971696de-af05-4d40-89d1-64a2688b08e0\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.980120 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53b4ae38-a993-4a83-93bf-da796d4be856-logs\") pod \"glance-default-external-api-0\" (UID: \"53b4ae38-a993-4a83-93bf-da796d4be856\") " pod="openstack/glance-default-external-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.985035 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53b4ae38-a993-4a83-93bf-da796d4be856-config-data\") pod \"glance-default-external-api-0\" (UID: \"53b4ae38-a993-4a83-93bf-da796d4be856\") " pod="openstack/glance-default-external-api-0" Oct 01 10:34:24 crc kubenswrapper[4735]: I1001 10:34:24.992349 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-667b6"] Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.002581 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53b4ae38-a993-4a83-93bf-da796d4be856-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"53b4ae38-a993-4a83-93bf-da796d4be856\") " pod="openstack/glance-default-external-api-0" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.003783 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/53b4ae38-a993-4a83-93bf-da796d4be856-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"53b4ae38-a993-4a83-93bf-da796d4be856\") " pod="openstack/glance-default-external-api-0" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.004246 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/971696de-af05-4d40-89d1-64a2688b08e0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"971696de-af05-4d40-89d1-64a2688b08e0\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.006599 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwlgx\" (UniqueName: \"kubernetes.io/projected/53b4ae38-a993-4a83-93bf-da796d4be856-kube-api-access-pwlgx\") pod \"glance-default-external-api-0\" (UID: \"53b4ae38-a993-4a83-93bf-da796d4be856\") " pod="openstack/glance-default-external-api-0" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.007202 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971696de-af05-4d40-89d1-64a2688b08e0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"971696de-af05-4d40-89d1-64a2688b08e0\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.007997 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/971696de-af05-4d40-89d1-64a2688b08e0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"971696de-af05-4d40-89d1-64a2688b08e0\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.009855 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/971696de-af05-4d40-89d1-64a2688b08e0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"971696de-af05-4d40-89d1-64a2688b08e0\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.018058 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53b4ae38-a993-4a83-93bf-da796d4be856-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"53b4ae38-a993-4a83-93bf-da796d4be856\") " pod="openstack/glance-default-external-api-0" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.032302 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkktp\" (UniqueName: \"kubernetes.io/projected/971696de-af05-4d40-89d1-64a2688b08e0-kube-api-access-qkktp\") pod \"glance-default-internal-api-0\" (UID: \"971696de-af05-4d40-89d1-64a2688b08e0\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.041751 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"53b4ae38-a993-4a83-93bf-da796d4be856\") " pod="openstack/glance-default-external-api-0" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.052983 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"971696de-af05-4d40-89d1-64a2688b08e0\") " pod="openstack/glance-default-internal-api-0" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.059869 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7fcd5cf54b-nqvjs"] Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.062136 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7fcd5cf54b-nqvjs" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.063426 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4ce28744-bfa2-4674-a495-02abf8245d38-config-data-custom\") pod \"barbican-keystone-listener-599c96b4d8-46mbs\" (UID: \"4ce28744-bfa2-4674-a495-02abf8245d38\") " pod="openstack/barbican-keystone-listener-599c96b4d8-46mbs" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.063577 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3fadfe66-81a6-4af1-9d8e-a65139585a11-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-667b6\" (UID: \"3fadfe66-81a6-4af1-9d8e-a65139585a11\") " pod="openstack/dnsmasq-dns-75c8ddd69c-667b6" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.063604 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ce28744-bfa2-4674-a495-02abf8245d38-config-data\") pod \"barbican-keystone-listener-599c96b4d8-46mbs\" (UID: \"4ce28744-bfa2-4674-a495-02abf8245d38\") " pod="openstack/barbican-keystone-listener-599c96b4d8-46mbs" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.063633 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce28744-bfa2-4674-a495-02abf8245d38-combined-ca-bundle\") pod \"barbican-keystone-listener-599c96b4d8-46mbs\" (UID: \"4ce28744-bfa2-4674-a495-02abf8245d38\") " pod="openstack/barbican-keystone-listener-599c96b4d8-46mbs" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.063661 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/3fadfe66-81a6-4af1-9d8e-a65139585a11-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-667b6\" (UID: \"3fadfe66-81a6-4af1-9d8e-a65139585a11\") " pod="openstack/dnsmasq-dns-75c8ddd69c-667b6" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.063684 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqs4z\" (UniqueName: \"kubernetes.io/projected/3fadfe66-81a6-4af1-9d8e-a65139585a11-kube-api-access-fqs4z\") pod \"dnsmasq-dns-75c8ddd69c-667b6\" (UID: \"3fadfe66-81a6-4af1-9d8e-a65139585a11\") " pod="openstack/dnsmasq-dns-75c8ddd69c-667b6" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.063712 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3fadfe66-81a6-4af1-9d8e-a65139585a11-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-667b6\" (UID: \"3fadfe66-81a6-4af1-9d8e-a65139585a11\") " pod="openstack/dnsmasq-dns-75c8ddd69c-667b6" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.063746 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ce28744-bfa2-4674-a495-02abf8245d38-logs\") pod \"barbican-keystone-listener-599c96b4d8-46mbs\" (UID: \"4ce28744-bfa2-4674-a495-02abf8245d38\") " pod="openstack/barbican-keystone-listener-599c96b4d8-46mbs" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.063777 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fadfe66-81a6-4af1-9d8e-a65139585a11-config\") pod \"dnsmasq-dns-75c8ddd69c-667b6\" (UID: \"3fadfe66-81a6-4af1-9d8e-a65139585a11\") " pod="openstack/dnsmasq-dns-75c8ddd69c-667b6" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.063805 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3fadfe66-81a6-4af1-9d8e-a65139585a11-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-667b6\" (UID: \"3fadfe66-81a6-4af1-9d8e-a65139585a11\") " pod="openstack/dnsmasq-dns-75c8ddd69c-667b6" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.063842 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54t2j\" (UniqueName: \"kubernetes.io/projected/4ce28744-bfa2-4674-a495-02abf8245d38-kube-api-access-54t2j\") pod \"barbican-keystone-listener-599c96b4d8-46mbs\" (UID: \"4ce28744-bfa2-4674-a495-02abf8245d38\") " pod="openstack/barbican-keystone-listener-599c96b4d8-46mbs" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.063882 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.065122 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ce28744-bfa2-4674-a495-02abf8245d38-logs\") pod \"barbican-keystone-listener-599c96b4d8-46mbs\" (UID: \"4ce28744-bfa2-4674-a495-02abf8245d38\") " pod="openstack/barbican-keystone-listener-599c96b4d8-46mbs" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.068217 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce28744-bfa2-4674-a495-02abf8245d38-combined-ca-bundle\") pod \"barbican-keystone-listener-599c96b4d8-46mbs\" (UID: \"4ce28744-bfa2-4674-a495-02abf8245d38\") " pod="openstack/barbican-keystone-listener-599c96b4d8-46mbs" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.078637 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ce28744-bfa2-4674-a495-02abf8245d38-config-data\") pod \"barbican-keystone-listener-599c96b4d8-46mbs\" (UID: \"4ce28744-bfa2-4674-a495-02abf8245d38\") " 
pod="openstack/barbican-keystone-listener-599c96b4d8-46mbs" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.080458 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4ce28744-bfa2-4674-a495-02abf8245d38-config-data-custom\") pod \"barbican-keystone-listener-599c96b4d8-46mbs\" (UID: \"4ce28744-bfa2-4674-a495-02abf8245d38\") " pod="openstack/barbican-keystone-listener-599c96b4d8-46mbs" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.086992 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-55fd7c674f-kfd9n" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.091128 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54t2j\" (UniqueName: \"kubernetes.io/projected/4ce28744-bfa2-4674-a495-02abf8245d38-kube-api-access-54t2j\") pod \"barbican-keystone-listener-599c96b4d8-46mbs\" (UID: \"4ce28744-bfa2-4674-a495-02abf8245d38\") " pod="openstack/barbican-keystone-listener-599c96b4d8-46mbs" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.093643 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7fcd5cf54b-nqvjs"] Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.096847 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-94689896b-kwvtc" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.164820 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fadfe66-81a6-4af1-9d8e-a65139585a11-config\") pod \"dnsmasq-dns-75c8ddd69c-667b6\" (UID: \"3fadfe66-81a6-4af1-9d8e-a65139585a11\") " pod="openstack/dnsmasq-dns-75c8ddd69c-667b6" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.164977 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hkc9\" (UniqueName: \"kubernetes.io/projected/a7e968c0-6535-481f-8583-7fdb64a2a42a-kube-api-access-8hkc9\") pod \"barbican-api-7fcd5cf54b-nqvjs\" (UID: \"a7e968c0-6535-481f-8583-7fdb64a2a42a\") " pod="openstack/barbican-api-7fcd5cf54b-nqvjs" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.165107 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3fadfe66-81a6-4af1-9d8e-a65139585a11-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-667b6\" (UID: \"3fadfe66-81a6-4af1-9d8e-a65139585a11\") " pod="openstack/dnsmasq-dns-75c8ddd69c-667b6" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.165290 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7e968c0-6535-481f-8583-7fdb64a2a42a-combined-ca-bundle\") pod \"barbican-api-7fcd5cf54b-nqvjs\" (UID: \"a7e968c0-6535-481f-8583-7fdb64a2a42a\") " pod="openstack/barbican-api-7fcd5cf54b-nqvjs" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.165434 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7e968c0-6535-481f-8583-7fdb64a2a42a-logs\") pod \"barbican-api-7fcd5cf54b-nqvjs\" (UID: 
\"a7e968c0-6535-481f-8583-7fdb64a2a42a\") " pod="openstack/barbican-api-7fcd5cf54b-nqvjs" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.165511 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3fadfe66-81a6-4af1-9d8e-a65139585a11-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-667b6\" (UID: \"3fadfe66-81a6-4af1-9d8e-a65139585a11\") " pod="openstack/dnsmasq-dns-75c8ddd69c-667b6" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.165676 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3fadfe66-81a6-4af1-9d8e-a65139585a11-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-667b6\" (UID: \"3fadfe66-81a6-4af1-9d8e-a65139585a11\") " pod="openstack/dnsmasq-dns-75c8ddd69c-667b6" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.165709 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7e968c0-6535-481f-8583-7fdb64a2a42a-config-data\") pod \"barbican-api-7fcd5cf54b-nqvjs\" (UID: \"a7e968c0-6535-481f-8583-7fdb64a2a42a\") " pod="openstack/barbican-api-7fcd5cf54b-nqvjs" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.165789 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqs4z\" (UniqueName: \"kubernetes.io/projected/3fadfe66-81a6-4af1-9d8e-a65139585a11-kube-api-access-fqs4z\") pod \"dnsmasq-dns-75c8ddd69c-667b6\" (UID: \"3fadfe66-81a6-4af1-9d8e-a65139585a11\") " pod="openstack/dnsmasq-dns-75c8ddd69c-667b6" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.165870 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3fadfe66-81a6-4af1-9d8e-a65139585a11-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-667b6\" (UID: 
\"3fadfe66-81a6-4af1-9d8e-a65139585a11\") " pod="openstack/dnsmasq-dns-75c8ddd69c-667b6" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.165947 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fadfe66-81a6-4af1-9d8e-a65139585a11-config\") pod \"dnsmasq-dns-75c8ddd69c-667b6\" (UID: \"3fadfe66-81a6-4af1-9d8e-a65139585a11\") " pod="openstack/dnsmasq-dns-75c8ddd69c-667b6" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.166852 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3fadfe66-81a6-4af1-9d8e-a65139585a11-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-667b6\" (UID: \"3fadfe66-81a6-4af1-9d8e-a65139585a11\") " pod="openstack/dnsmasq-dns-75c8ddd69c-667b6" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.167207 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3fadfe66-81a6-4af1-9d8e-a65139585a11-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-667b6\" (UID: \"3fadfe66-81a6-4af1-9d8e-a65139585a11\") " pod="openstack/dnsmasq-dns-75c8ddd69c-667b6" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.167278 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7e968c0-6535-481f-8583-7fdb64a2a42a-config-data-custom\") pod \"barbican-api-7fcd5cf54b-nqvjs\" (UID: \"a7e968c0-6535-481f-8583-7fdb64a2a42a\") " pod="openstack/barbican-api-7fcd5cf54b-nqvjs" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.167852 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3fadfe66-81a6-4af1-9d8e-a65139585a11-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-667b6\" (UID: \"3fadfe66-81a6-4af1-9d8e-a65139585a11\") " 
pod="openstack/dnsmasq-dns-75c8ddd69c-667b6" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.167870 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3fadfe66-81a6-4af1-9d8e-a65139585a11-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-667b6\" (UID: \"3fadfe66-81a6-4af1-9d8e-a65139585a11\") " pod="openstack/dnsmasq-dns-75c8ddd69c-667b6" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.190931 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqs4z\" (UniqueName: \"kubernetes.io/projected/3fadfe66-81a6-4af1-9d8e-a65139585a11-kube-api-access-fqs4z\") pod \"dnsmasq-dns-75c8ddd69c-667b6\" (UID: \"3fadfe66-81a6-4af1-9d8e-a65139585a11\") " pod="openstack/dnsmasq-dns-75c8ddd69c-667b6" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.251753 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.268839 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dwvk\" (UniqueName: \"kubernetes.io/projected/5f2b4d4c-d741-4a88-b71c-fda5946d3896-kube-api-access-9dwvk\") pod \"5f2b4d4c-d741-4a88-b71c-fda5946d3896\" (UID: \"5f2b4d4c-d741-4a88-b71c-fda5946d3896\") " Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.268918 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5f2b4d4c-d741-4a88-b71c-fda5946d3896-config\") pod \"5f2b4d4c-d741-4a88-b71c-fda5946d3896\" (UID: \"5f2b4d4c-d741-4a88-b71c-fda5946d3896\") " Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.269027 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f2b4d4c-d741-4a88-b71c-fda5946d3896-ovndb-tls-certs\") pod 
\"5f2b4d4c-d741-4a88-b71c-fda5946d3896\" (UID: \"5f2b4d4c-d741-4a88-b71c-fda5946d3896\") " Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.269130 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5f2b4d4c-d741-4a88-b71c-fda5946d3896-httpd-config\") pod \"5f2b4d4c-d741-4a88-b71c-fda5946d3896\" (UID: \"5f2b4d4c-d741-4a88-b71c-fda5946d3896\") " Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.269153 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f2b4d4c-d741-4a88-b71c-fda5946d3896-combined-ca-bundle\") pod \"5f2b4d4c-d741-4a88-b71c-fda5946d3896\" (UID: \"5f2b4d4c-d741-4a88-b71c-fda5946d3896\") " Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.269392 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7e968c0-6535-481f-8583-7fdb64a2a42a-config-data-custom\") pod \"barbican-api-7fcd5cf54b-nqvjs\" (UID: \"a7e968c0-6535-481f-8583-7fdb64a2a42a\") " pod="openstack/barbican-api-7fcd5cf54b-nqvjs" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.269432 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hkc9\" (UniqueName: \"kubernetes.io/projected/a7e968c0-6535-481f-8583-7fdb64a2a42a-kube-api-access-8hkc9\") pod \"barbican-api-7fcd5cf54b-nqvjs\" (UID: \"a7e968c0-6535-481f-8583-7fdb64a2a42a\") " pod="openstack/barbican-api-7fcd5cf54b-nqvjs" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.269515 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7e968c0-6535-481f-8583-7fdb64a2a42a-combined-ca-bundle\") pod \"barbican-api-7fcd5cf54b-nqvjs\" (UID: \"a7e968c0-6535-481f-8583-7fdb64a2a42a\") " pod="openstack/barbican-api-7fcd5cf54b-nqvjs" Oct 
01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.269551 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7e968c0-6535-481f-8583-7fdb64a2a42a-logs\") pod \"barbican-api-7fcd5cf54b-nqvjs\" (UID: \"a7e968c0-6535-481f-8583-7fdb64a2a42a\") " pod="openstack/barbican-api-7fcd5cf54b-nqvjs" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.269613 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7e968c0-6535-481f-8583-7fdb64a2a42a-config-data\") pod \"barbican-api-7fcd5cf54b-nqvjs\" (UID: \"a7e968c0-6535-481f-8583-7fdb64a2a42a\") " pod="openstack/barbican-api-7fcd5cf54b-nqvjs" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.271247 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7e968c0-6535-481f-8583-7fdb64a2a42a-logs\") pod \"barbican-api-7fcd5cf54b-nqvjs\" (UID: \"a7e968c0-6535-481f-8583-7fdb64a2a42a\") " pod="openstack/barbican-api-7fcd5cf54b-nqvjs" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.278276 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f2b4d4c-d741-4a88-b71c-fda5946d3896-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "5f2b4d4c-d741-4a88-b71c-fda5946d3896" (UID: "5f2b4d4c-d741-4a88-b71c-fda5946d3896"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.279576 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7e968c0-6535-481f-8583-7fdb64a2a42a-config-data\") pod \"barbican-api-7fcd5cf54b-nqvjs\" (UID: \"a7e968c0-6535-481f-8583-7fdb64a2a42a\") " pod="openstack/barbican-api-7fcd5cf54b-nqvjs" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.282643 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f2b4d4c-d741-4a88-b71c-fda5946d3896-kube-api-access-9dwvk" (OuterVolumeSpecName: "kube-api-access-9dwvk") pod "5f2b4d4c-d741-4a88-b71c-fda5946d3896" (UID: "5f2b4d4c-d741-4a88-b71c-fda5946d3896"). InnerVolumeSpecName "kube-api-access-9dwvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.286387 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7e968c0-6535-481f-8583-7fdb64a2a42a-combined-ca-bundle\") pod \"barbican-api-7fcd5cf54b-nqvjs\" (UID: \"a7e968c0-6535-481f-8583-7fdb64a2a42a\") " pod="openstack/barbican-api-7fcd5cf54b-nqvjs" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.287623 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7e968c0-6535-481f-8583-7fdb64a2a42a-config-data-custom\") pod \"barbican-api-7fcd5cf54b-nqvjs\" (UID: \"a7e968c0-6535-481f-8583-7fdb64a2a42a\") " pod="openstack/barbican-api-7fcd5cf54b-nqvjs" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.294346 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hkc9\" (UniqueName: \"kubernetes.io/projected/a7e968c0-6535-481f-8583-7fdb64a2a42a-kube-api-access-8hkc9\") pod \"barbican-api-7fcd5cf54b-nqvjs\" (UID: 
\"a7e968c0-6535-481f-8583-7fdb64a2a42a\") " pod="openstack/barbican-api-7fcd5cf54b-nqvjs" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.347065 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.355312 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f2b4d4c-d741-4a88-b71c-fda5946d3896-config" (OuterVolumeSpecName: "config") pod "5f2b4d4c-d741-4a88-b71c-fda5946d3896" (UID: "5f2b4d4c-d741-4a88-b71c-fda5946d3896"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.365638 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f2b4d4c-d741-4a88-b71c-fda5946d3896-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f2b4d4c-d741-4a88-b71c-fda5946d3896" (UID: "5f2b4d4c-d741-4a88-b71c-fda5946d3896"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.371507 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dwvk\" (UniqueName: \"kubernetes.io/projected/5f2b4d4c-d741-4a88-b71c-fda5946d3896-kube-api-access-9dwvk\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.371530 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5f2b4d4c-d741-4a88-b71c-fda5946d3896-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.371539 4735 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5f2b4d4c-d741-4a88-b71c-fda5946d3896-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.371548 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f2b4d4c-d741-4a88-b71c-fda5946d3896-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.380318 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f2b4d4c-d741-4a88-b71c-fda5946d3896-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "5f2b4d4c-d741-4a88-b71c-fda5946d3896" (UID: "5f2b4d4c-d741-4a88-b71c-fda5946d3896"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.388513 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-599c96b4d8-46mbs" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.412308 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-667b6" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.422805 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7fcd5cf54b-nqvjs" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.474840 4735 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f2b4d4c-d741-4a88-b71c-fda5946d3896-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.502797 4735 generic.go:334] "Generic (PLEG): container finished" podID="8506c575-4d5d-4691-9ebd-75d8c878f9a9" containerID="f16d05092237210f009ea9104512444793fc9c41bc32a45a487845d6d6f77605" exitCode=0 Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.502854 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8506c575-4d5d-4691-9ebd-75d8c878f9a9","Type":"ContainerDied","Data":"f16d05092237210f009ea9104512444793fc9c41bc32a45a487845d6d6f77605"} Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.511374 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-94689896b-kwvtc" event={"ID":"5f2b4d4c-d741-4a88-b71c-fda5946d3896","Type":"ContainerDied","Data":"6a0bd98ea2644432043b0e5f08007f1b0d5005801036a0d0e70b112faf6209eb"} Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.511416 4735 scope.go:117] "RemoveContainer" containerID="efb380d7acdfb188019f59c7ffb0a953d8a3325c7e5425bdcf6f172f779bd4ad" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.511572 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-94689896b-kwvtc" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.554951 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-55fd7c674f-kfd9n"] Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.563169 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-94689896b-kwvtc"] Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.575719 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-94689896b-kwvtc"] Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.584978 4735 scope.go:117] "RemoveContainer" containerID="4c2bb6ac267ef9269e11d07112b91da398af5bddd4295e35e8730117353b16f4" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.682791 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.907120 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45ed423f-4895-4df3-9a04-2b916f38f57d" path="/var/lib/kubelet/pods/45ed423f-4895-4df3-9a04-2b916f38f57d/volumes" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.907750 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c4185de-864c-4329-a2b2-bc7164bb33c6" path="/var/lib/kubelet/pods/4c4185de-864c-4329-a2b2-bc7164bb33c6/volumes" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.908414 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f2b4d4c-d741-4a88-b71c-fda5946d3896" path="/var/lib/kubelet/pods/5f2b4d4c-d741-4a88-b71c-fda5946d3896/volumes" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.909451 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4" path="/var/lib/kubelet/pods/de8a0fb9-bd5f-4a80-83dd-aa2eda4fc1d4/volumes" Oct 01 10:34:25 crc kubenswrapper[4735]: I1001 10:34:25.996435 4735 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.211658 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cdcc-account-create-5vjp4" Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.247830 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ea66-account-create-hsfw4" Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.261399 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-3a5f-account-create-2knz5" Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.397198 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmlgm\" (UniqueName: \"kubernetes.io/projected/5f1489d8-cd50-4c5f-b6f8-22854e49b246-kube-api-access-tmlgm\") pod \"5f1489d8-cd50-4c5f-b6f8-22854e49b246\" (UID: \"5f1489d8-cd50-4c5f-b6f8-22854e49b246\") " Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.397388 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwhgh\" (UniqueName: \"kubernetes.io/projected/01d783e4-455a-4e50-a5a6-ca1a289b2ad7-kube-api-access-cwhgh\") pod \"01d783e4-455a-4e50-a5a6-ca1a289b2ad7\" (UID: \"01d783e4-455a-4e50-a5a6-ca1a289b2ad7\") " Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.397422 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxtsr\" (UniqueName: \"kubernetes.io/projected/7258162b-c9ac-49b4-a0ef-ec5f19491efa-kube-api-access-dxtsr\") pod \"7258162b-c9ac-49b4-a0ef-ec5f19491efa\" (UID: \"7258162b-c9ac-49b4-a0ef-ec5f19491efa\") " Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.408724 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7258162b-c9ac-49b4-a0ef-ec5f19491efa-kube-api-access-dxtsr" 
(OuterVolumeSpecName: "kube-api-access-dxtsr") pod "7258162b-c9ac-49b4-a0ef-ec5f19491efa" (UID: "7258162b-c9ac-49b4-a0ef-ec5f19491efa"). InnerVolumeSpecName "kube-api-access-dxtsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.412981 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f1489d8-cd50-4c5f-b6f8-22854e49b246-kube-api-access-tmlgm" (OuterVolumeSpecName: "kube-api-access-tmlgm") pod "5f1489d8-cd50-4c5f-b6f8-22854e49b246" (UID: "5f1489d8-cd50-4c5f-b6f8-22854e49b246"). InnerVolumeSpecName "kube-api-access-tmlgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.416765 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01d783e4-455a-4e50-a5a6-ca1a289b2ad7-kube-api-access-cwhgh" (OuterVolumeSpecName: "kube-api-access-cwhgh") pod "01d783e4-455a-4e50-a5a6-ca1a289b2ad7" (UID: "01d783e4-455a-4e50-a5a6-ca1a289b2ad7"). InnerVolumeSpecName "kube-api-access-cwhgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.497354 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-667b6"] Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.499582 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmlgm\" (UniqueName: \"kubernetes.io/projected/5f1489d8-cd50-4c5f-b6f8-22854e49b246-kube-api-access-tmlgm\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.499613 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwhgh\" (UniqueName: \"kubernetes.io/projected/01d783e4-455a-4e50-a5a6-ca1a289b2ad7-kube-api-access-cwhgh\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.499622 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxtsr\" (UniqueName: \"kubernetes.io/projected/7258162b-c9ac-49b4-a0ef-ec5f19491efa-kube-api-access-dxtsr\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.510689 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.519482 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-599c96b4d8-46mbs"] Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.550727 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7fcd5cf54b-nqvjs"] Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.582724 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ea66-account-create-hsfw4" event={"ID":"5f1489d8-cd50-4c5f-b6f8-22854e49b246","Type":"ContainerDied","Data":"799e83d6e5f2bf3f0a6074621a97c4de6f3e58994c3112d778a802f3b7fadb20"} Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.582840 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="799e83d6e5f2bf3f0a6074621a97c4de6f3e58994c3112d778a802f3b7fadb20" Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.582951 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-ea66-account-create-hsfw4" Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.600809 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbxhx\" (UniqueName: \"kubernetes.io/projected/8506c575-4d5d-4691-9ebd-75d8c878f9a9-kube-api-access-bbxhx\") pod \"8506c575-4d5d-4691-9ebd-75d8c878f9a9\" (UID: \"8506c575-4d5d-4691-9ebd-75d8c878f9a9\") " Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.601065 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8506c575-4d5d-4691-9ebd-75d8c878f9a9-config-data\") pod \"8506c575-4d5d-4691-9ebd-75d8c878f9a9\" (UID: \"8506c575-4d5d-4691-9ebd-75d8c878f9a9\") " Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.601174 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8506c575-4d5d-4691-9ebd-75d8c878f9a9-log-httpd\") pod \"8506c575-4d5d-4691-9ebd-75d8c878f9a9\" (UID: \"8506c575-4d5d-4691-9ebd-75d8c878f9a9\") " Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.601268 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8506c575-4d5d-4691-9ebd-75d8c878f9a9-run-httpd\") pod \"8506c575-4d5d-4691-9ebd-75d8c878f9a9\" (UID: \"8506c575-4d5d-4691-9ebd-75d8c878f9a9\") " Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.601354 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8506c575-4d5d-4691-9ebd-75d8c878f9a9-scripts\") pod \"8506c575-4d5d-4691-9ebd-75d8c878f9a9\" (UID: \"8506c575-4d5d-4691-9ebd-75d8c878f9a9\") " Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.601425 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8506c575-4d5d-4691-9ebd-75d8c878f9a9-combined-ca-bundle\") pod \"8506c575-4d5d-4691-9ebd-75d8c878f9a9\" (UID: \"8506c575-4d5d-4691-9ebd-75d8c878f9a9\") " Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.601555 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8506c575-4d5d-4691-9ebd-75d8c878f9a9-sg-core-conf-yaml\") pod \"8506c575-4d5d-4691-9ebd-75d8c878f9a9\" (UID: \"8506c575-4d5d-4691-9ebd-75d8c878f9a9\") " Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.611810 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8506c575-4d5d-4691-9ebd-75d8c878f9a9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8506c575-4d5d-4691-9ebd-75d8c878f9a9" (UID: "8506c575-4d5d-4691-9ebd-75d8c878f9a9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.612989 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8506c575-4d5d-4691-9ebd-75d8c878f9a9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8506c575-4d5d-4691-9ebd-75d8c878f9a9" (UID: "8506c575-4d5d-4691-9ebd-75d8c878f9a9"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.620830 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-55fd7c674f-kfd9n" event={"ID":"39f45bdc-e25e-41f8-aefe-0dede18c4bb8","Type":"ContainerStarted","Data":"be711b1f11876d3ff8beebe4e732c37da10174f3bd0417925b7e288c00f88088"} Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.635301 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3a5f-account-create-2knz5" event={"ID":"01d783e4-455a-4e50-a5a6-ca1a289b2ad7","Type":"ContainerDied","Data":"f5ee0f9d59463794dd4c83a26f5f37b9ed2b15d69d3291cc6189268f11fe07be"} Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.635621 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5ee0f9d59463794dd4c83a26f5f37b9ed2b15d69d3291cc6189268f11fe07be" Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.635800 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-3a5f-account-create-2knz5" Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.635983 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8506c575-4d5d-4691-9ebd-75d8c878f9a9-scripts" (OuterVolumeSpecName: "scripts") pod "8506c575-4d5d-4691-9ebd-75d8c878f9a9" (UID: "8506c575-4d5d-4691-9ebd-75d8c878f9a9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.636726 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8506c575-4d5d-4691-9ebd-75d8c878f9a9-kube-api-access-bbxhx" (OuterVolumeSpecName: "kube-api-access-bbxhx") pod "8506c575-4d5d-4691-9ebd-75d8c878f9a9" (UID: "8506c575-4d5d-4691-9ebd-75d8c878f9a9"). InnerVolumeSpecName "kube-api-access-bbxhx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.668328 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8506c575-4d5d-4691-9ebd-75d8c878f9a9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8506c575-4d5d-4691-9ebd-75d8c878f9a9" (UID: "8506c575-4d5d-4691-9ebd-75d8c878f9a9"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.676678 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8506c575-4d5d-4691-9ebd-75d8c878f9a9","Type":"ContainerDied","Data":"b9326a7ea2df5acfc940acd75d3fc2ee9b826a751cb8115960bd3f4a6de4b62b"} Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.676951 4735 scope.go:117] "RemoveContainer" containerID="77bae45e0a3516c6625bd67820d03f8c1de32b8759457e0fb94b7e4e62b09187" Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.677187 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.690963 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"53b4ae38-a993-4a83-93bf-da796d4be856","Type":"ContainerStarted","Data":"f9ff0f7d511806a9d1ced122737cc1bf437f4037c550d200e97dc196c676d3e6"} Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.698776 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"971696de-af05-4d40-89d1-64a2688b08e0","Type":"ContainerStarted","Data":"395fff6b3840f287a1c4bf5bcea7c906995ddf411eb1a0779147e454e8025d70"} Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.703047 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cdcc-account-create-5vjp4" event={"ID":"7258162b-c9ac-49b4-a0ef-ec5f19491efa","Type":"ContainerDied","Data":"464e6c5a559f6d8acc5cc32967a108929e7d0a282bb6bdf134f5974124fbb44e"} Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.703158 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="464e6c5a559f6d8acc5cc32967a108929e7d0a282bb6bdf134f5974124fbb44e" Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.703293 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cdcc-account-create-5vjp4" Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.705878 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbxhx\" (UniqueName: \"kubernetes.io/projected/8506c575-4d5d-4691-9ebd-75d8c878f9a9-kube-api-access-bbxhx\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.705905 4735 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8506c575-4d5d-4691-9ebd-75d8c878f9a9-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.705915 4735 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8506c575-4d5d-4691-9ebd-75d8c878f9a9-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.705925 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8506c575-4d5d-4691-9ebd-75d8c878f9a9-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.705935 4735 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8506c575-4d5d-4691-9ebd-75d8c878f9a9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.706020 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-667b6" event={"ID":"3fadfe66-81a6-4af1-9d8e-a65139585a11","Type":"ContainerStarted","Data":"6f68f1afbbddf759ddd5a1a81dd0bfb6df94b03c87af2b7d2620539e9aa7de2c"} Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.727956 4735 scope.go:117] "RemoveContainer" containerID="f16d05092237210f009ea9104512444793fc9c41bc32a45a487845d6d6f77605" Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.748795 4735 scope.go:117] "RemoveContainer" 
containerID="f2fd5fba0b761cf62bfc989e53081dd9793c5448aae946a24d57446b23d5f2fb" Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.782697 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8506c575-4d5d-4691-9ebd-75d8c878f9a9-config-data" (OuterVolumeSpecName: "config-data") pod "8506c575-4d5d-4691-9ebd-75d8c878f9a9" (UID: "8506c575-4d5d-4691-9ebd-75d8c878f9a9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.811721 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8506c575-4d5d-4691-9ebd-75d8c878f9a9-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.813745 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8506c575-4d5d-4691-9ebd-75d8c878f9a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8506c575-4d5d-4691-9ebd-75d8c878f9a9" (UID: "8506c575-4d5d-4691-9ebd-75d8c878f9a9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:34:26 crc kubenswrapper[4735]: E1001 10:34:26.883912 4735 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f1489d8_cd50_4c5f_b6f8_22854e49b246.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01d783e4_455a_4e50_a5a6_ca1a289b2ad7.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f1489d8_cd50_4c5f_b6f8_22854e49b246.slice/crio-799e83d6e5f2bf3f0a6074621a97c4de6f3e58994c3112d778a802f3b7fadb20\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7258162b_c9ac_49b4_a0ef_ec5f19491efa.slice\": RecentStats: unable to find data in memory cache]" Oct 01 10:34:26 crc kubenswrapper[4735]: I1001 10:34:26.913246 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8506c575-4d5d-4691-9ebd-75d8c878f9a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.060546 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.075505 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.084262 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:34:27 crc kubenswrapper[4735]: E1001 10:34:27.084597 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8506c575-4d5d-4691-9ebd-75d8c878f9a9" containerName="ceilometer-central-agent" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.084609 4735 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8506c575-4d5d-4691-9ebd-75d8c878f9a9" containerName="ceilometer-central-agent" Oct 01 10:34:27 crc kubenswrapper[4735]: E1001 10:34:27.084617 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f1489d8-cd50-4c5f-b6f8-22854e49b246" containerName="mariadb-account-create" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.084623 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f1489d8-cd50-4c5f-b6f8-22854e49b246" containerName="mariadb-account-create" Oct 01 10:34:27 crc kubenswrapper[4735]: E1001 10:34:27.084638 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01d783e4-455a-4e50-a5a6-ca1a289b2ad7" containerName="mariadb-account-create" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.084644 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="01d783e4-455a-4e50-a5a6-ca1a289b2ad7" containerName="mariadb-account-create" Oct 01 10:34:27 crc kubenswrapper[4735]: E1001 10:34:27.084655 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7258162b-c9ac-49b4-a0ef-ec5f19491efa" containerName="mariadb-account-create" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.084661 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7258162b-c9ac-49b4-a0ef-ec5f19491efa" containerName="mariadb-account-create" Oct 01 10:34:27 crc kubenswrapper[4735]: E1001 10:34:27.084670 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8506c575-4d5d-4691-9ebd-75d8c878f9a9" containerName="sg-core" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.084676 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8506c575-4d5d-4691-9ebd-75d8c878f9a9" containerName="sg-core" Oct 01 10:34:27 crc kubenswrapper[4735]: E1001 10:34:27.084695 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f2b4d4c-d741-4a88-b71c-fda5946d3896" containerName="neutron-httpd" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.084701 4735 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="5f2b4d4c-d741-4a88-b71c-fda5946d3896" containerName="neutron-httpd" Oct 01 10:34:27 crc kubenswrapper[4735]: E1001 10:34:27.084711 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8506c575-4d5d-4691-9ebd-75d8c878f9a9" containerName="ceilometer-notification-agent" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.084716 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8506c575-4d5d-4691-9ebd-75d8c878f9a9" containerName="ceilometer-notification-agent" Oct 01 10:34:27 crc kubenswrapper[4735]: E1001 10:34:27.084728 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f2b4d4c-d741-4a88-b71c-fda5946d3896" containerName="neutron-api" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.084734 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f2b4d4c-d741-4a88-b71c-fda5946d3896" containerName="neutron-api" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.084888 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="8506c575-4d5d-4691-9ebd-75d8c878f9a9" containerName="ceilometer-central-agent" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.084901 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="8506c575-4d5d-4691-9ebd-75d8c878f9a9" containerName="ceilometer-notification-agent" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.084909 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f2b4d4c-d741-4a88-b71c-fda5946d3896" containerName="neutron-api" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.084919 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7258162b-c9ac-49b4-a0ef-ec5f19491efa" containerName="mariadb-account-create" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.084926 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="8506c575-4d5d-4691-9ebd-75d8c878f9a9" containerName="sg-core" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 
10:34:27.084934 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f2b4d4c-d741-4a88-b71c-fda5946d3896" containerName="neutron-httpd" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.084945 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="01d783e4-455a-4e50-a5a6-ca1a289b2ad7" containerName="mariadb-account-create" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.084959 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f1489d8-cd50-4c5f-b6f8-22854e49b246" containerName="mariadb-account-create" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.087187 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.090167 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.090850 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.098471 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.217529 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7549dccc-93b5-4e12-9cec-cc4bbde202d4-log-httpd\") pod \"ceilometer-0\" (UID: \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\") " pod="openstack/ceilometer-0" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.217879 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77h24\" (UniqueName: \"kubernetes.io/projected/7549dccc-93b5-4e12-9cec-cc4bbde202d4-kube-api-access-77h24\") pod \"ceilometer-0\" (UID: \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\") " pod="openstack/ceilometer-0" Oct 01 10:34:27 crc 
kubenswrapper[4735]: I1001 10:34:27.217900 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7549dccc-93b5-4e12-9cec-cc4bbde202d4-scripts\") pod \"ceilometer-0\" (UID: \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\") " pod="openstack/ceilometer-0" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.217915 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7549dccc-93b5-4e12-9cec-cc4bbde202d4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\") " pod="openstack/ceilometer-0" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.217951 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7549dccc-93b5-4e12-9cec-cc4bbde202d4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\") " pod="openstack/ceilometer-0" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.217969 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7549dccc-93b5-4e12-9cec-cc4bbde202d4-run-httpd\") pod \"ceilometer-0\" (UID: \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\") " pod="openstack/ceilometer-0" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.217989 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7549dccc-93b5-4e12-9cec-cc4bbde202d4-config-data\") pod \"ceilometer-0\" (UID: \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\") " pod="openstack/ceilometer-0" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.319104 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/7549dccc-93b5-4e12-9cec-cc4bbde202d4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\") " pod="openstack/ceilometer-0" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.319147 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7549dccc-93b5-4e12-9cec-cc4bbde202d4-run-httpd\") pod \"ceilometer-0\" (UID: \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\") " pod="openstack/ceilometer-0" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.319198 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7549dccc-93b5-4e12-9cec-cc4bbde202d4-config-data\") pod \"ceilometer-0\" (UID: \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\") " pod="openstack/ceilometer-0" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.319267 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7549dccc-93b5-4e12-9cec-cc4bbde202d4-log-httpd\") pod \"ceilometer-0\" (UID: \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\") " pod="openstack/ceilometer-0" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.319333 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77h24\" (UniqueName: \"kubernetes.io/projected/7549dccc-93b5-4e12-9cec-cc4bbde202d4-kube-api-access-77h24\") pod \"ceilometer-0\" (UID: \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\") " pod="openstack/ceilometer-0" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.319353 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7549dccc-93b5-4e12-9cec-cc4bbde202d4-scripts\") pod \"ceilometer-0\" (UID: \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\") " pod="openstack/ceilometer-0" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 
10:34:27.319370 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7549dccc-93b5-4e12-9cec-cc4bbde202d4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\") " pod="openstack/ceilometer-0" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.320573 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7549dccc-93b5-4e12-9cec-cc4bbde202d4-log-httpd\") pod \"ceilometer-0\" (UID: \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\") " pod="openstack/ceilometer-0" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.320627 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7549dccc-93b5-4e12-9cec-cc4bbde202d4-run-httpd\") pod \"ceilometer-0\" (UID: \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\") " pod="openstack/ceilometer-0" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.325439 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7549dccc-93b5-4e12-9cec-cc4bbde202d4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\") " pod="openstack/ceilometer-0" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.327411 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7549dccc-93b5-4e12-9cec-cc4bbde202d4-scripts\") pod \"ceilometer-0\" (UID: \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\") " pod="openstack/ceilometer-0" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.327913 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7549dccc-93b5-4e12-9cec-cc4bbde202d4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\") " 
pod="openstack/ceilometer-0" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.336755 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7549dccc-93b5-4e12-9cec-cc4bbde202d4-config-data\") pod \"ceilometer-0\" (UID: \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\") " pod="openstack/ceilometer-0" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.341987 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77h24\" (UniqueName: \"kubernetes.io/projected/7549dccc-93b5-4e12-9cec-cc4bbde202d4-kube-api-access-77h24\") pod \"ceilometer-0\" (UID: \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\") " pod="openstack/ceilometer-0" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.413199 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.724556 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"53b4ae38-a993-4a83-93bf-da796d4be856","Type":"ContainerStarted","Data":"d0e30f6866aa745f7e3ca6e07b09b957351584b29c9d3bc7483f549115929cfb"} Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.740673 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7fcd5cf54b-nqvjs" event={"ID":"a7e968c0-6535-481f-8583-7fdb64a2a42a","Type":"ContainerStarted","Data":"7167c0f960acfac3c80a01a81a1597f8c9f2c89ad10dde96d94f58e9f5d0d2ba"} Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.740722 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7fcd5cf54b-nqvjs" event={"ID":"a7e968c0-6535-481f-8583-7fdb64a2a42a","Type":"ContainerStarted","Data":"bcf8cfd28bdc60525407bf68a2b587a982080d8af68eabe2dbd525ddb9676f2a"} Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.740731 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-7fcd5cf54b-nqvjs" event={"ID":"a7e968c0-6535-481f-8583-7fdb64a2a42a","Type":"ContainerStarted","Data":"98e04b75214927027e94218a19122b1e0908c9bb9a7ccc94343fbcc438458004"} Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.743613 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"971696de-af05-4d40-89d1-64a2688b08e0","Type":"ContainerStarted","Data":"b82340480b193a14b70e215446db2562643f3ad9558e40e34a158083e7fe966c"} Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.743653 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"971696de-af05-4d40-89d1-64a2688b08e0","Type":"ContainerStarted","Data":"2a80dded94ae2932af1b1250398ef968b4ec679635502aac3365460390c5dca1"} Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.755199 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-599c96b4d8-46mbs" event={"ID":"4ce28744-bfa2-4674-a495-02abf8245d38","Type":"ContainerStarted","Data":"2821d9972309a0deecf6ca40bcb3ab2e631fee9f1d5560494f66a3c5a1f9be97"} Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.757023 4735 generic.go:334] "Generic (PLEG): container finished" podID="3fadfe66-81a6-4af1-9d8e-a65139585a11" containerID="1570d784f61d2fa72addb295b30cfeb3a8da22a170e7aa616a9028ac85f28bdd" exitCode=0 Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.757050 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-667b6" event={"ID":"3fadfe66-81a6-4af1-9d8e-a65139585a11","Type":"ContainerDied","Data":"1570d784f61d2fa72addb295b30cfeb3a8da22a170e7aa616a9028ac85f28bdd"} Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.766894 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.766873934 podStartE2EDuration="3.766873934s" 
podCreationTimestamp="2025-10-01 10:34:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:34:27.76558733 +0000 UTC m=+1026.458408612" watchObservedRunningTime="2025-10-01 10:34:27.766873934 +0000 UTC m=+1026.459695206" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.789111 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7fcd5cf54b-nqvjs" podStartSLOduration=3.788956495 podStartE2EDuration="3.788956495s" podCreationTimestamp="2025-10-01 10:34:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:34:27.785543594 +0000 UTC m=+1026.478364866" watchObservedRunningTime="2025-10-01 10:34:27.788956495 +0000 UTC m=+1026.481777757" Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.895350 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:34:27 crc kubenswrapper[4735]: I1001 10:34:27.912219 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8506c575-4d5d-4691-9ebd-75d8c878f9a9" path="/var/lib/kubelet/pods/8506c575-4d5d-4691-9ebd-75d8c878f9a9/volumes" Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.075423 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7ddbc654c4-6bddt"] Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.077267 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7ddbc654c4-6bddt" Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.080583 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.081034 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.089942 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7ddbc654c4-6bddt"] Oct 01 10:34:28 crc kubenswrapper[4735]: W1001 10:34:28.132515 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7549dccc_93b5_4e12_9cec_cc4bbde202d4.slice/crio-b30475ce360590b3c0e3856bb3a31e2ed94a6646e459d91fbd0906bb583c4c00 WatchSource:0}: Error finding container b30475ce360590b3c0e3856bb3a31e2ed94a6646e459d91fbd0906bb583c4c00: Status 404 returned error can't find the container with id b30475ce360590b3c0e3856bb3a31e2ed94a6646e459d91fbd0906bb583c4c00 Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.236014 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4554523-3ac0-4f1d-8a5d-f8892d72229d-combined-ca-bundle\") pod \"barbican-api-7ddbc654c4-6bddt\" (UID: \"e4554523-3ac0-4f1d-8a5d-f8892d72229d\") " pod="openstack/barbican-api-7ddbc654c4-6bddt" Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.236101 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4554523-3ac0-4f1d-8a5d-f8892d72229d-public-tls-certs\") pod \"barbican-api-7ddbc654c4-6bddt\" (UID: \"e4554523-3ac0-4f1d-8a5d-f8892d72229d\") " pod="openstack/barbican-api-7ddbc654c4-6bddt" Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 
10:34:28.236140 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2bh5\" (UniqueName: \"kubernetes.io/projected/e4554523-3ac0-4f1d-8a5d-f8892d72229d-kube-api-access-r2bh5\") pod \"barbican-api-7ddbc654c4-6bddt\" (UID: \"e4554523-3ac0-4f1d-8a5d-f8892d72229d\") " pod="openstack/barbican-api-7ddbc654c4-6bddt" Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.236182 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4554523-3ac0-4f1d-8a5d-f8892d72229d-logs\") pod \"barbican-api-7ddbc654c4-6bddt\" (UID: \"e4554523-3ac0-4f1d-8a5d-f8892d72229d\") " pod="openstack/barbican-api-7ddbc654c4-6bddt" Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.236277 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4554523-3ac0-4f1d-8a5d-f8892d72229d-internal-tls-certs\") pod \"barbican-api-7ddbc654c4-6bddt\" (UID: \"e4554523-3ac0-4f1d-8a5d-f8892d72229d\") " pod="openstack/barbican-api-7ddbc654c4-6bddt" Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.236335 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4554523-3ac0-4f1d-8a5d-f8892d72229d-config-data-custom\") pod \"barbican-api-7ddbc654c4-6bddt\" (UID: \"e4554523-3ac0-4f1d-8a5d-f8892d72229d\") " pod="openstack/barbican-api-7ddbc654c4-6bddt" Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.236445 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4554523-3ac0-4f1d-8a5d-f8892d72229d-config-data\") pod \"barbican-api-7ddbc654c4-6bddt\" (UID: \"e4554523-3ac0-4f1d-8a5d-f8892d72229d\") " pod="openstack/barbican-api-7ddbc654c4-6bddt" 
Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.338131 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4554523-3ac0-4f1d-8a5d-f8892d72229d-combined-ca-bundle\") pod \"barbican-api-7ddbc654c4-6bddt\" (UID: \"e4554523-3ac0-4f1d-8a5d-f8892d72229d\") " pod="openstack/barbican-api-7ddbc654c4-6bddt" Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.338512 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4554523-3ac0-4f1d-8a5d-f8892d72229d-public-tls-certs\") pod \"barbican-api-7ddbc654c4-6bddt\" (UID: \"e4554523-3ac0-4f1d-8a5d-f8892d72229d\") " pod="openstack/barbican-api-7ddbc654c4-6bddt" Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.338535 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2bh5\" (UniqueName: \"kubernetes.io/projected/e4554523-3ac0-4f1d-8a5d-f8892d72229d-kube-api-access-r2bh5\") pod \"barbican-api-7ddbc654c4-6bddt\" (UID: \"e4554523-3ac0-4f1d-8a5d-f8892d72229d\") " pod="openstack/barbican-api-7ddbc654c4-6bddt" Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.338565 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4554523-3ac0-4f1d-8a5d-f8892d72229d-logs\") pod \"barbican-api-7ddbc654c4-6bddt\" (UID: \"e4554523-3ac0-4f1d-8a5d-f8892d72229d\") " pod="openstack/barbican-api-7ddbc654c4-6bddt" Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.338599 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4554523-3ac0-4f1d-8a5d-f8892d72229d-internal-tls-certs\") pod \"barbican-api-7ddbc654c4-6bddt\" (UID: \"e4554523-3ac0-4f1d-8a5d-f8892d72229d\") " pod="openstack/barbican-api-7ddbc654c4-6bddt" Oct 01 10:34:28 crc 
kubenswrapper[4735]: I1001 10:34:28.338624 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4554523-3ac0-4f1d-8a5d-f8892d72229d-config-data-custom\") pod \"barbican-api-7ddbc654c4-6bddt\" (UID: \"e4554523-3ac0-4f1d-8a5d-f8892d72229d\") " pod="openstack/barbican-api-7ddbc654c4-6bddt" Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.338664 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4554523-3ac0-4f1d-8a5d-f8892d72229d-config-data\") pod \"barbican-api-7ddbc654c4-6bddt\" (UID: \"e4554523-3ac0-4f1d-8a5d-f8892d72229d\") " pod="openstack/barbican-api-7ddbc654c4-6bddt" Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.339531 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4554523-3ac0-4f1d-8a5d-f8892d72229d-logs\") pod \"barbican-api-7ddbc654c4-6bddt\" (UID: \"e4554523-3ac0-4f1d-8a5d-f8892d72229d\") " pod="openstack/barbican-api-7ddbc654c4-6bddt" Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.342583 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4554523-3ac0-4f1d-8a5d-f8892d72229d-combined-ca-bundle\") pod \"barbican-api-7ddbc654c4-6bddt\" (UID: \"e4554523-3ac0-4f1d-8a5d-f8892d72229d\") " pod="openstack/barbican-api-7ddbc654c4-6bddt" Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.343333 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4554523-3ac0-4f1d-8a5d-f8892d72229d-internal-tls-certs\") pod \"barbican-api-7ddbc654c4-6bddt\" (UID: \"e4554523-3ac0-4f1d-8a5d-f8892d72229d\") " pod="openstack/barbican-api-7ddbc654c4-6bddt" Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.343538 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4554523-3ac0-4f1d-8a5d-f8892d72229d-public-tls-certs\") pod \"barbican-api-7ddbc654c4-6bddt\" (UID: \"e4554523-3ac0-4f1d-8a5d-f8892d72229d\") " pod="openstack/barbican-api-7ddbc654c4-6bddt" Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.343741 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4554523-3ac0-4f1d-8a5d-f8892d72229d-config-data\") pod \"barbican-api-7ddbc654c4-6bddt\" (UID: \"e4554523-3ac0-4f1d-8a5d-f8892d72229d\") " pod="openstack/barbican-api-7ddbc654c4-6bddt" Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.344954 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4554523-3ac0-4f1d-8a5d-f8892d72229d-config-data-custom\") pod \"barbican-api-7ddbc654c4-6bddt\" (UID: \"e4554523-3ac0-4f1d-8a5d-f8892d72229d\") " pod="openstack/barbican-api-7ddbc654c4-6bddt" Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.354983 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2bh5\" (UniqueName: \"kubernetes.io/projected/e4554523-3ac0-4f1d-8a5d-f8892d72229d-kube-api-access-r2bh5\") pod \"barbican-api-7ddbc654c4-6bddt\" (UID: \"e4554523-3ac0-4f1d-8a5d-f8892d72229d\") " pod="openstack/barbican-api-7ddbc654c4-6bddt" Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.393577 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7ddbc654c4-6bddt" Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.767562 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"53b4ae38-a993-4a83-93bf-da796d4be856","Type":"ContainerStarted","Data":"224080cb6cf22d89e1f61be31bae6449972445457fda82125e65b23b31df8ebd"} Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.769843 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-55fd7c674f-kfd9n" event={"ID":"39f45bdc-e25e-41f8-aefe-0dede18c4bb8","Type":"ContainerStarted","Data":"208d727a6750bc833606fc75bed6b6a1bd0c47d097e9d9d49914820c231c9320"} Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.785304 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7549dccc-93b5-4e12-9cec-cc4bbde202d4","Type":"ContainerStarted","Data":"b30475ce360590b3c0e3856bb3a31e2ed94a6646e459d91fbd0906bb583c4c00"} Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.795373 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.795357844 podStartE2EDuration="4.795357844s" podCreationTimestamp="2025-10-01 10:34:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:34:28.794305236 +0000 UTC m=+1027.487126498" watchObservedRunningTime="2025-10-01 10:34:28.795357844 +0000 UTC m=+1027.488179096" Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.804888 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-599c96b4d8-46mbs" event={"ID":"4ce28744-bfa2-4674-a495-02abf8245d38","Type":"ContainerStarted","Data":"405153af949c014bd84cc28f9322baa25af1c539c1f3f71753b0f9cc630f606d"} Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.807418 4735 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-667b6" event={"ID":"3fadfe66-81a6-4af1-9d8e-a65139585a11","Type":"ContainerStarted","Data":"61957b35911e804bfe5137cd82eebbcc4d6cdd566ab0e5a91bf0845ef28eb84b"} Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.808445 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7fcd5cf54b-nqvjs" Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.808481 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7fcd5cf54b-nqvjs" Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.809283 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75c8ddd69c-667b6" Oct 01 10:34:28 crc kubenswrapper[4735]: I1001 10:34:28.841788 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75c8ddd69c-667b6" podStartSLOduration=4.841770435 podStartE2EDuration="4.841770435s" podCreationTimestamp="2025-10-01 10:34:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:34:28.825285354 +0000 UTC m=+1027.518106616" watchObservedRunningTime="2025-10-01 10:34:28.841770435 +0000 UTC m=+1027.534591697" Oct 01 10:34:29 crc kubenswrapper[4735]: I1001 10:34:29.038680 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7ddbc654c4-6bddt"] Oct 01 10:34:29 crc kubenswrapper[4735]: W1001 10:34:29.052347 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4554523_3ac0_4f1d_8a5d_f8892d72229d.slice/crio-ae74584f984ec8dfb3aeab64564ba07e0a3e44cce2789e35a5314abba2a2b736 WatchSource:0}: Error finding container ae74584f984ec8dfb3aeab64564ba07e0a3e44cce2789e35a5314abba2a2b736: Status 404 returned error can't find the container with id 
ae74584f984ec8dfb3aeab64564ba07e0a3e44cce2789e35a5314abba2a2b736 Oct 01 10:34:29 crc kubenswrapper[4735]: I1001 10:34:29.376002 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:34:29 crc kubenswrapper[4735]: I1001 10:34:29.826129 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-55fd7c674f-kfd9n" event={"ID":"39f45bdc-e25e-41f8-aefe-0dede18c4bb8","Type":"ContainerStarted","Data":"4a132c6642d80ad97d32a3e8230494543ad4950827e76acfb777d826e10d3802"} Oct 01 10:34:29 crc kubenswrapper[4735]: I1001 10:34:29.828000 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7ddbc654c4-6bddt" event={"ID":"e4554523-3ac0-4f1d-8a5d-f8892d72229d","Type":"ContainerStarted","Data":"7a2e6307d00acd87814227f07ce65ff6a2d6ae2f24a987e74deefee9296794fa"} Oct 01 10:34:29 crc kubenswrapper[4735]: I1001 10:34:29.828041 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7ddbc654c4-6bddt" event={"ID":"e4554523-3ac0-4f1d-8a5d-f8892d72229d","Type":"ContainerStarted","Data":"e3aecbda6eed59d6dfdfbbb50b7abb9abc96e7c2a53f6a683262af78e983ae78"} Oct 01 10:34:29 crc kubenswrapper[4735]: I1001 10:34:29.828055 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7ddbc654c4-6bddt" event={"ID":"e4554523-3ac0-4f1d-8a5d-f8892d72229d","Type":"ContainerStarted","Data":"ae74584f984ec8dfb3aeab64564ba07e0a3e44cce2789e35a5314abba2a2b736"} Oct 01 10:34:29 crc kubenswrapper[4735]: I1001 10:34:29.828079 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7ddbc654c4-6bddt" Oct 01 10:34:29 crc kubenswrapper[4735]: I1001 10:34:29.828104 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7ddbc654c4-6bddt" Oct 01 10:34:29 crc kubenswrapper[4735]: I1001 10:34:29.829802 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7549dccc-93b5-4e12-9cec-cc4bbde202d4","Type":"ContainerStarted","Data":"9def4c8fc2179cac5bb856fd171faeef1b41faaa4d7bf6087bc972fdaa38b526"} Oct 01 10:34:29 crc kubenswrapper[4735]: I1001 10:34:29.831450 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-599c96b4d8-46mbs" event={"ID":"4ce28744-bfa2-4674-a495-02abf8245d38","Type":"ContainerStarted","Data":"5518e2d6b82b0f43745c2972522fa88b430f18c9b1158bf912e6e612650ca0ab"} Oct 01 10:34:29 crc kubenswrapper[4735]: I1001 10:34:29.847572 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-55fd7c674f-kfd9n" podStartSLOduration=2.883258728 podStartE2EDuration="5.847551598s" podCreationTimestamp="2025-10-01 10:34:24 +0000 UTC" firstStartedPulling="2025-10-01 10:34:25.600666424 +0000 UTC m=+1024.293487686" lastFinishedPulling="2025-10-01 10:34:28.564959304 +0000 UTC m=+1027.257780556" observedRunningTime="2025-10-01 10:34:29.842901624 +0000 UTC m=+1028.535722886" watchObservedRunningTime="2025-10-01 10:34:29.847551598 +0000 UTC m=+1028.540372860" Oct 01 10:34:29 crc kubenswrapper[4735]: I1001 10:34:29.860923 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7ddbc654c4-6bddt" podStartSLOduration=1.8609067449999999 podStartE2EDuration="1.860906745s" podCreationTimestamp="2025-10-01 10:34:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:34:29.857248287 +0000 UTC m=+1028.550069549" watchObservedRunningTime="2025-10-01 10:34:29.860906745 +0000 UTC m=+1028.553727997" Oct 01 10:34:30 crc kubenswrapper[4735]: I1001 10:34:30.843021 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7549dccc-93b5-4e12-9cec-cc4bbde202d4","Type":"ContainerStarted","Data":"b720b59f7b2dc9f3f7c934d3084bdfb2b7a26361f0b1845573efa3135ec61c8e"} Oct 01 
10:34:30 crc kubenswrapper[4735]: I1001 10:34:30.843072 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7549dccc-93b5-4e12-9cec-cc4bbde202d4","Type":"ContainerStarted","Data":"2c927ff04b88f019d8b1e88db7db3c2c074a5848b77b0e98d4df20dcc7a1e14f"} Oct 01 10:34:31 crc kubenswrapper[4735]: I1001 10:34:31.737584 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-599c96b4d8-46mbs" podStartSLOduration=5.741788091 podStartE2EDuration="7.737564774s" podCreationTimestamp="2025-10-01 10:34:24 +0000 UTC" firstStartedPulling="2025-10-01 10:34:26.564845925 +0000 UTC m=+1025.257667187" lastFinishedPulling="2025-10-01 10:34:28.560622608 +0000 UTC m=+1027.253443870" observedRunningTime="2025-10-01 10:34:29.884204669 +0000 UTC m=+1028.577025931" watchObservedRunningTime="2025-10-01 10:34:31.737564774 +0000 UTC m=+1030.430386066" Oct 01 10:34:31 crc kubenswrapper[4735]: I1001 10:34:31.741919 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x5q6q"] Oct 01 10:34:31 crc kubenswrapper[4735]: I1001 10:34:31.743483 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-x5q6q" Oct 01 10:34:31 crc kubenswrapper[4735]: I1001 10:34:31.748085 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 01 10:34:31 crc kubenswrapper[4735]: I1001 10:34:31.748662 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-f4xb6" Oct 01 10:34:31 crc kubenswrapper[4735]: I1001 10:34:31.748893 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 01 10:34:31 crc kubenswrapper[4735]: I1001 10:34:31.752863 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x5q6q"] Oct 01 10:34:31 crc kubenswrapper[4735]: I1001 10:34:31.806728 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caf22975-46d3-4339-87d4-e693baddc266-config-data\") pod \"nova-cell0-conductor-db-sync-x5q6q\" (UID: \"caf22975-46d3-4339-87d4-e693baddc266\") " pod="openstack/nova-cell0-conductor-db-sync-x5q6q" Oct 01 10:34:31 crc kubenswrapper[4735]: I1001 10:34:31.806788 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkz7v\" (UniqueName: \"kubernetes.io/projected/caf22975-46d3-4339-87d4-e693baddc266-kube-api-access-dkz7v\") pod \"nova-cell0-conductor-db-sync-x5q6q\" (UID: \"caf22975-46d3-4339-87d4-e693baddc266\") " pod="openstack/nova-cell0-conductor-db-sync-x5q6q" Oct 01 10:34:31 crc kubenswrapper[4735]: I1001 10:34:31.806842 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/caf22975-46d3-4339-87d4-e693baddc266-scripts\") pod \"nova-cell0-conductor-db-sync-x5q6q\" (UID: \"caf22975-46d3-4339-87d4-e693baddc266\") " 
pod="openstack/nova-cell0-conductor-db-sync-x5q6q" Oct 01 10:34:31 crc kubenswrapper[4735]: I1001 10:34:31.806858 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caf22975-46d3-4339-87d4-e693baddc266-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-x5q6q\" (UID: \"caf22975-46d3-4339-87d4-e693baddc266\") " pod="openstack/nova-cell0-conductor-db-sync-x5q6q" Oct 01 10:34:31 crc kubenswrapper[4735]: I1001 10:34:31.909769 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkz7v\" (UniqueName: \"kubernetes.io/projected/caf22975-46d3-4339-87d4-e693baddc266-kube-api-access-dkz7v\") pod \"nova-cell0-conductor-db-sync-x5q6q\" (UID: \"caf22975-46d3-4339-87d4-e693baddc266\") " pod="openstack/nova-cell0-conductor-db-sync-x5q6q" Oct 01 10:34:31 crc kubenswrapper[4735]: I1001 10:34:31.909847 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/caf22975-46d3-4339-87d4-e693baddc266-scripts\") pod \"nova-cell0-conductor-db-sync-x5q6q\" (UID: \"caf22975-46d3-4339-87d4-e693baddc266\") " pod="openstack/nova-cell0-conductor-db-sync-x5q6q" Oct 01 10:34:31 crc kubenswrapper[4735]: I1001 10:34:31.909869 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caf22975-46d3-4339-87d4-e693baddc266-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-x5q6q\" (UID: \"caf22975-46d3-4339-87d4-e693baddc266\") " pod="openstack/nova-cell0-conductor-db-sync-x5q6q" Oct 01 10:34:31 crc kubenswrapper[4735]: I1001 10:34:31.909969 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caf22975-46d3-4339-87d4-e693baddc266-config-data\") pod \"nova-cell0-conductor-db-sync-x5q6q\" (UID: 
\"caf22975-46d3-4339-87d4-e693baddc266\") " pod="openstack/nova-cell0-conductor-db-sync-x5q6q" Oct 01 10:34:31 crc kubenswrapper[4735]: I1001 10:34:31.922997 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caf22975-46d3-4339-87d4-e693baddc266-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-x5q6q\" (UID: \"caf22975-46d3-4339-87d4-e693baddc266\") " pod="openstack/nova-cell0-conductor-db-sync-x5q6q" Oct 01 10:34:31 crc kubenswrapper[4735]: I1001 10:34:31.923639 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/caf22975-46d3-4339-87d4-e693baddc266-scripts\") pod \"nova-cell0-conductor-db-sync-x5q6q\" (UID: \"caf22975-46d3-4339-87d4-e693baddc266\") " pod="openstack/nova-cell0-conductor-db-sync-x5q6q" Oct 01 10:34:31 crc kubenswrapper[4735]: I1001 10:34:31.931227 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caf22975-46d3-4339-87d4-e693baddc266-config-data\") pod \"nova-cell0-conductor-db-sync-x5q6q\" (UID: \"caf22975-46d3-4339-87d4-e693baddc266\") " pod="openstack/nova-cell0-conductor-db-sync-x5q6q" Oct 01 10:34:31 crc kubenswrapper[4735]: I1001 10:34:31.945618 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkz7v\" (UniqueName: \"kubernetes.io/projected/caf22975-46d3-4339-87d4-e693baddc266-kube-api-access-dkz7v\") pod \"nova-cell0-conductor-db-sync-x5q6q\" (UID: \"caf22975-46d3-4339-87d4-e693baddc266\") " pod="openstack/nova-cell0-conductor-db-sync-x5q6q" Oct 01 10:34:32 crc kubenswrapper[4735]: I1001 10:34:32.059632 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-x5q6q" Oct 01 10:34:32 crc kubenswrapper[4735]: I1001 10:34:32.625554 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x5q6q"] Oct 01 10:34:32 crc kubenswrapper[4735]: I1001 10:34:32.862251 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7549dccc-93b5-4e12-9cec-cc4bbde202d4","Type":"ContainerStarted","Data":"c66294ac23ab171d9b9db2776d8b6eadf2204eca5d5f113ba4b5f22abac038a7"} Oct 01 10:34:32 crc kubenswrapper[4735]: I1001 10:34:32.862535 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7549dccc-93b5-4e12-9cec-cc4bbde202d4" containerName="ceilometer-central-agent" containerID="cri-o://9def4c8fc2179cac5bb856fd171faeef1b41faaa4d7bf6087bc972fdaa38b526" gracePeriod=30 Oct 01 10:34:32 crc kubenswrapper[4735]: I1001 10:34:32.862586 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7549dccc-93b5-4e12-9cec-cc4bbde202d4" containerName="ceilometer-notification-agent" containerID="cri-o://2c927ff04b88f019d8b1e88db7db3c2c074a5848b77b0e98d4df20dcc7a1e14f" gracePeriod=30 Oct 01 10:34:32 crc kubenswrapper[4735]: I1001 10:34:32.862622 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7549dccc-93b5-4e12-9cec-cc4bbde202d4" containerName="sg-core" containerID="cri-o://b720b59f7b2dc9f3f7c934d3084bdfb2b7a26361f0b1845573efa3135ec61c8e" gracePeriod=30 Oct 01 10:34:32 crc kubenswrapper[4735]: I1001 10:34:32.862556 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7549dccc-93b5-4e12-9cec-cc4bbde202d4" containerName="proxy-httpd" containerID="cri-o://c66294ac23ab171d9b9db2776d8b6eadf2204eca5d5f113ba4b5f22abac038a7" gracePeriod=30 Oct 01 10:34:32 crc kubenswrapper[4735]: I1001 
10:34:32.863879 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 10:34:32 crc kubenswrapper[4735]: I1001 10:34:32.864379 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-x5q6q" event={"ID":"caf22975-46d3-4339-87d4-e693baddc266","Type":"ContainerStarted","Data":"f31654e62454b37b06eb185abe76eee1f24cda541bb06fa11cb1e8be72986db7"} Oct 01 10:34:32 crc kubenswrapper[4735]: I1001 10:34:32.899762 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.462694197 podStartE2EDuration="5.899744668s" podCreationTimestamp="2025-10-01 10:34:27 +0000 UTC" firstStartedPulling="2025-10-01 10:34:28.470743975 +0000 UTC m=+1027.163565237" lastFinishedPulling="2025-10-01 10:34:31.907794446 +0000 UTC m=+1030.600615708" observedRunningTime="2025-10-01 10:34:32.89682561 +0000 UTC m=+1031.589646872" watchObservedRunningTime="2025-10-01 10:34:32.899744668 +0000 UTC m=+1031.592565930" Oct 01 10:34:33 crc kubenswrapper[4735]: I1001 10:34:33.878629 4735 generic.go:334] "Generic (PLEG): container finished" podID="7549dccc-93b5-4e12-9cec-cc4bbde202d4" containerID="c66294ac23ab171d9b9db2776d8b6eadf2204eca5d5f113ba4b5f22abac038a7" exitCode=0 Oct 01 10:34:33 crc kubenswrapper[4735]: I1001 10:34:33.878992 4735 generic.go:334] "Generic (PLEG): container finished" podID="7549dccc-93b5-4e12-9cec-cc4bbde202d4" containerID="b720b59f7b2dc9f3f7c934d3084bdfb2b7a26361f0b1845573efa3135ec61c8e" exitCode=2 Oct 01 10:34:33 crc kubenswrapper[4735]: I1001 10:34:33.879007 4735 generic.go:334] "Generic (PLEG): container finished" podID="7549dccc-93b5-4e12-9cec-cc4bbde202d4" containerID="2c927ff04b88f019d8b1e88db7db3c2c074a5848b77b0e98d4df20dcc7a1e14f" exitCode=0 Oct 01 10:34:33 crc kubenswrapper[4735]: I1001 10:34:33.878708 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7549dccc-93b5-4e12-9cec-cc4bbde202d4","Type":"ContainerDied","Data":"c66294ac23ab171d9b9db2776d8b6eadf2204eca5d5f113ba4b5f22abac038a7"} Oct 01 10:34:33 crc kubenswrapper[4735]: I1001 10:34:33.879053 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7549dccc-93b5-4e12-9cec-cc4bbde202d4","Type":"ContainerDied","Data":"b720b59f7b2dc9f3f7c934d3084bdfb2b7a26361f0b1845573efa3135ec61c8e"} Oct 01 10:34:33 crc kubenswrapper[4735]: I1001 10:34:33.879070 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7549dccc-93b5-4e12-9cec-cc4bbde202d4","Type":"ContainerDied","Data":"2c927ff04b88f019d8b1e88db7db3c2c074a5848b77b0e98d4df20dcc7a1e14f"} Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.644282 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.772976 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7549dccc-93b5-4e12-9cec-cc4bbde202d4-combined-ca-bundle\") pod \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\" (UID: \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\") " Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.773044 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77h24\" (UniqueName: \"kubernetes.io/projected/7549dccc-93b5-4e12-9cec-cc4bbde202d4-kube-api-access-77h24\") pod \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\" (UID: \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\") " Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.773071 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7549dccc-93b5-4e12-9cec-cc4bbde202d4-log-httpd\") pod \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\" (UID: \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\") " Oct 01 
10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.773093 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7549dccc-93b5-4e12-9cec-cc4bbde202d4-config-data\") pod \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\" (UID: \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\") " Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.773114 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7549dccc-93b5-4e12-9cec-cc4bbde202d4-sg-core-conf-yaml\") pod \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\" (UID: \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\") " Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.773204 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7549dccc-93b5-4e12-9cec-cc4bbde202d4-scripts\") pod \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\" (UID: \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\") " Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.773316 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7549dccc-93b5-4e12-9cec-cc4bbde202d4-run-httpd\") pod \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\" (UID: \"7549dccc-93b5-4e12-9cec-cc4bbde202d4\") " Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.773845 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7549dccc-93b5-4e12-9cec-cc4bbde202d4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7549dccc-93b5-4e12-9cec-cc4bbde202d4" (UID: "7549dccc-93b5-4e12-9cec-cc4bbde202d4"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.773917 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7549dccc-93b5-4e12-9cec-cc4bbde202d4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7549dccc-93b5-4e12-9cec-cc4bbde202d4" (UID: "7549dccc-93b5-4e12-9cec-cc4bbde202d4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.782675 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7549dccc-93b5-4e12-9cec-cc4bbde202d4-scripts" (OuterVolumeSpecName: "scripts") pod "7549dccc-93b5-4e12-9cec-cc4bbde202d4" (UID: "7549dccc-93b5-4e12-9cec-cc4bbde202d4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.798597 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7549dccc-93b5-4e12-9cec-cc4bbde202d4-kube-api-access-77h24" (OuterVolumeSpecName: "kube-api-access-77h24") pod "7549dccc-93b5-4e12-9cec-cc4bbde202d4" (UID: "7549dccc-93b5-4e12-9cec-cc4bbde202d4"). InnerVolumeSpecName "kube-api-access-77h24". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.807609 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7549dccc-93b5-4e12-9cec-cc4bbde202d4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7549dccc-93b5-4e12-9cec-cc4bbde202d4" (UID: "7549dccc-93b5-4e12-9cec-cc4bbde202d4"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.875553 4735 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7549dccc-93b5-4e12-9cec-cc4bbde202d4-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.875588 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77h24\" (UniqueName: \"kubernetes.io/projected/7549dccc-93b5-4e12-9cec-cc4bbde202d4-kube-api-access-77h24\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.875599 4735 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7549dccc-93b5-4e12-9cec-cc4bbde202d4-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.875607 4735 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7549dccc-93b5-4e12-9cec-cc4bbde202d4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.875616 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7549dccc-93b5-4e12-9cec-cc4bbde202d4-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.889677 4735 generic.go:334] "Generic (PLEG): container finished" podID="7549dccc-93b5-4e12-9cec-cc4bbde202d4" containerID="9def4c8fc2179cac5bb856fd171faeef1b41faaa4d7bf6087bc972fdaa38b526" exitCode=0 Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.889719 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7549dccc-93b5-4e12-9cec-cc4bbde202d4","Type":"ContainerDied","Data":"9def4c8fc2179cac5bb856fd171faeef1b41faaa4d7bf6087bc972fdaa38b526"} Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.889744 4735 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7549dccc-93b5-4e12-9cec-cc4bbde202d4","Type":"ContainerDied","Data":"b30475ce360590b3c0e3856bb3a31e2ed94a6646e459d91fbd0906bb583c4c00"} Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.889759 4735 scope.go:117] "RemoveContainer" containerID="c66294ac23ab171d9b9db2776d8b6eadf2204eca5d5f113ba4b5f22abac038a7" Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.889974 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.892966 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7549dccc-93b5-4e12-9cec-cc4bbde202d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7549dccc-93b5-4e12-9cec-cc4bbde202d4" (UID: "7549dccc-93b5-4e12-9cec-cc4bbde202d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.908616 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7549dccc-93b5-4e12-9cec-cc4bbde202d4-config-data" (OuterVolumeSpecName: "config-data") pod "7549dccc-93b5-4e12-9cec-cc4bbde202d4" (UID: "7549dccc-93b5-4e12-9cec-cc4bbde202d4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.928694 4735 scope.go:117] "RemoveContainer" containerID="b720b59f7b2dc9f3f7c934d3084bdfb2b7a26361f0b1845573efa3135ec61c8e" Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.952467 4735 scope.go:117] "RemoveContainer" containerID="2c927ff04b88f019d8b1e88db7db3c2c074a5848b77b0e98d4df20dcc7a1e14f" Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.971507 4735 scope.go:117] "RemoveContainer" containerID="9def4c8fc2179cac5bb856fd171faeef1b41faaa4d7bf6087bc972fdaa38b526" Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.978014 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7549dccc-93b5-4e12-9cec-cc4bbde202d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.978039 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7549dccc-93b5-4e12-9cec-cc4bbde202d4-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.995537 4735 scope.go:117] "RemoveContainer" containerID="c66294ac23ab171d9b9db2776d8b6eadf2204eca5d5f113ba4b5f22abac038a7" Oct 01 10:34:34 crc kubenswrapper[4735]: E1001 10:34:34.996428 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c66294ac23ab171d9b9db2776d8b6eadf2204eca5d5f113ba4b5f22abac038a7\": container with ID starting with c66294ac23ab171d9b9db2776d8b6eadf2204eca5d5f113ba4b5f22abac038a7 not found: ID does not exist" containerID="c66294ac23ab171d9b9db2776d8b6eadf2204eca5d5f113ba4b5f22abac038a7" Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.996476 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c66294ac23ab171d9b9db2776d8b6eadf2204eca5d5f113ba4b5f22abac038a7"} 
err="failed to get container status \"c66294ac23ab171d9b9db2776d8b6eadf2204eca5d5f113ba4b5f22abac038a7\": rpc error: code = NotFound desc = could not find container \"c66294ac23ab171d9b9db2776d8b6eadf2204eca5d5f113ba4b5f22abac038a7\": container with ID starting with c66294ac23ab171d9b9db2776d8b6eadf2204eca5d5f113ba4b5f22abac038a7 not found: ID does not exist" Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.996517 4735 scope.go:117] "RemoveContainer" containerID="b720b59f7b2dc9f3f7c934d3084bdfb2b7a26361f0b1845573efa3135ec61c8e" Oct 01 10:34:34 crc kubenswrapper[4735]: E1001 10:34:34.997251 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b720b59f7b2dc9f3f7c934d3084bdfb2b7a26361f0b1845573efa3135ec61c8e\": container with ID starting with b720b59f7b2dc9f3f7c934d3084bdfb2b7a26361f0b1845573efa3135ec61c8e not found: ID does not exist" containerID="b720b59f7b2dc9f3f7c934d3084bdfb2b7a26361f0b1845573efa3135ec61c8e" Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.997285 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b720b59f7b2dc9f3f7c934d3084bdfb2b7a26361f0b1845573efa3135ec61c8e"} err="failed to get container status \"b720b59f7b2dc9f3f7c934d3084bdfb2b7a26361f0b1845573efa3135ec61c8e\": rpc error: code = NotFound desc = could not find container \"b720b59f7b2dc9f3f7c934d3084bdfb2b7a26361f0b1845573efa3135ec61c8e\": container with ID starting with b720b59f7b2dc9f3f7c934d3084bdfb2b7a26361f0b1845573efa3135ec61c8e not found: ID does not exist" Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.997307 4735 scope.go:117] "RemoveContainer" containerID="2c927ff04b88f019d8b1e88db7db3c2c074a5848b77b0e98d4df20dcc7a1e14f" Oct 01 10:34:34 crc kubenswrapper[4735]: E1001 10:34:34.997866 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2c927ff04b88f019d8b1e88db7db3c2c074a5848b77b0e98d4df20dcc7a1e14f\": container with ID starting with 2c927ff04b88f019d8b1e88db7db3c2c074a5848b77b0e98d4df20dcc7a1e14f not found: ID does not exist" containerID="2c927ff04b88f019d8b1e88db7db3c2c074a5848b77b0e98d4df20dcc7a1e14f" Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.997910 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c927ff04b88f019d8b1e88db7db3c2c074a5848b77b0e98d4df20dcc7a1e14f"} err="failed to get container status \"2c927ff04b88f019d8b1e88db7db3c2c074a5848b77b0e98d4df20dcc7a1e14f\": rpc error: code = NotFound desc = could not find container \"2c927ff04b88f019d8b1e88db7db3c2c074a5848b77b0e98d4df20dcc7a1e14f\": container with ID starting with 2c927ff04b88f019d8b1e88db7db3c2c074a5848b77b0e98d4df20dcc7a1e14f not found: ID does not exist" Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.997961 4735 scope.go:117] "RemoveContainer" containerID="9def4c8fc2179cac5bb856fd171faeef1b41faaa4d7bf6087bc972fdaa38b526" Oct 01 10:34:34 crc kubenswrapper[4735]: E1001 10:34:34.998255 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9def4c8fc2179cac5bb856fd171faeef1b41faaa4d7bf6087bc972fdaa38b526\": container with ID starting with 9def4c8fc2179cac5bb856fd171faeef1b41faaa4d7bf6087bc972fdaa38b526 not found: ID does not exist" containerID="9def4c8fc2179cac5bb856fd171faeef1b41faaa4d7bf6087bc972fdaa38b526" Oct 01 10:34:34 crc kubenswrapper[4735]: I1001 10:34:34.998288 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9def4c8fc2179cac5bb856fd171faeef1b41faaa4d7bf6087bc972fdaa38b526"} err="failed to get container status \"9def4c8fc2179cac5bb856fd171faeef1b41faaa4d7bf6087bc972fdaa38b526\": rpc error: code = NotFound desc = could not find container \"9def4c8fc2179cac5bb856fd171faeef1b41faaa4d7bf6087bc972fdaa38b526\": container with ID 
starting with 9def4c8fc2179cac5bb856fd171faeef1b41faaa4d7bf6087bc972fdaa38b526 not found: ID does not exist" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.226929 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.231250 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.245472 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:34:35 crc kubenswrapper[4735]: E1001 10:34:35.249581 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7549dccc-93b5-4e12-9cec-cc4bbde202d4" containerName="proxy-httpd" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.249608 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7549dccc-93b5-4e12-9cec-cc4bbde202d4" containerName="proxy-httpd" Oct 01 10:34:35 crc kubenswrapper[4735]: E1001 10:34:35.249625 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7549dccc-93b5-4e12-9cec-cc4bbde202d4" containerName="sg-core" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.249633 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7549dccc-93b5-4e12-9cec-cc4bbde202d4" containerName="sg-core" Oct 01 10:34:35 crc kubenswrapper[4735]: E1001 10:34:35.249659 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7549dccc-93b5-4e12-9cec-cc4bbde202d4" containerName="ceilometer-central-agent" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.249666 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7549dccc-93b5-4e12-9cec-cc4bbde202d4" containerName="ceilometer-central-agent" Oct 01 10:34:35 crc kubenswrapper[4735]: E1001 10:34:35.249687 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7549dccc-93b5-4e12-9cec-cc4bbde202d4" containerName="ceilometer-notification-agent" Oct 01 10:34:35 crc kubenswrapper[4735]: 
I1001 10:34:35.249692 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7549dccc-93b5-4e12-9cec-cc4bbde202d4" containerName="ceilometer-notification-agent" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.249904 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7549dccc-93b5-4e12-9cec-cc4bbde202d4" containerName="proxy-httpd" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.249924 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7549dccc-93b5-4e12-9cec-cc4bbde202d4" containerName="ceilometer-central-agent" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.249932 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7549dccc-93b5-4e12-9cec-cc4bbde202d4" containerName="ceilometer-notification-agent" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.249941 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7549dccc-93b5-4e12-9cec-cc4bbde202d4" containerName="sg-core" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.251445 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.253957 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.254000 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.257147 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.257198 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.268667 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.300213 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.300739 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.348711 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.348752 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.378446 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.384103 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0aecc00b-0696-48f2-a28e-6572d3735880-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0aecc00b-0696-48f2-a28e-6572d3735880\") " pod="openstack/ceilometer-0" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.384355 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0aecc00b-0696-48f2-a28e-6572d3735880-log-httpd\") pod \"ceilometer-0\" (UID: \"0aecc00b-0696-48f2-a28e-6572d3735880\") " pod="openstack/ceilometer-0" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.384477 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0aecc00b-0696-48f2-a28e-6572d3735880-run-httpd\") pod \"ceilometer-0\" (UID: \"0aecc00b-0696-48f2-a28e-6572d3735880\") " pod="openstack/ceilometer-0" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.384524 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8m82\" (UniqueName: \"kubernetes.io/projected/0aecc00b-0696-48f2-a28e-6572d3735880-kube-api-access-x8m82\") pod \"ceilometer-0\" (UID: \"0aecc00b-0696-48f2-a28e-6572d3735880\") " pod="openstack/ceilometer-0" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.384566 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aecc00b-0696-48f2-a28e-6572d3735880-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0aecc00b-0696-48f2-a28e-6572d3735880\") " pod="openstack/ceilometer-0" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.384582 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0aecc00b-0696-48f2-a28e-6572d3735880-config-data\") pod 
\"ceilometer-0\" (UID: \"0aecc00b-0696-48f2-a28e-6572d3735880\") " pod="openstack/ceilometer-0" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.384624 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0aecc00b-0696-48f2-a28e-6572d3735880-scripts\") pod \"ceilometer-0\" (UID: \"0aecc00b-0696-48f2-a28e-6572d3735880\") " pod="openstack/ceilometer-0" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.387763 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.428072 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75c8ddd69c-667b6" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.486575 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0aecc00b-0696-48f2-a28e-6572d3735880-run-httpd\") pod \"ceilometer-0\" (UID: \"0aecc00b-0696-48f2-a28e-6572d3735880\") " pod="openstack/ceilometer-0" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.486844 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8m82\" (UniqueName: \"kubernetes.io/projected/0aecc00b-0696-48f2-a28e-6572d3735880-kube-api-access-x8m82\") pod \"ceilometer-0\" (UID: \"0aecc00b-0696-48f2-a28e-6572d3735880\") " pod="openstack/ceilometer-0" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.486957 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aecc00b-0696-48f2-a28e-6572d3735880-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0aecc00b-0696-48f2-a28e-6572d3735880\") " pod="openstack/ceilometer-0" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.487056 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0aecc00b-0696-48f2-a28e-6572d3735880-config-data\") pod \"ceilometer-0\" (UID: \"0aecc00b-0696-48f2-a28e-6572d3735880\") " pod="openstack/ceilometer-0" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.487160 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0aecc00b-0696-48f2-a28e-6572d3735880-scripts\") pod \"ceilometer-0\" (UID: \"0aecc00b-0696-48f2-a28e-6572d3735880\") " pod="openstack/ceilometer-0" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.487289 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0aecc00b-0696-48f2-a28e-6572d3735880-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0aecc00b-0696-48f2-a28e-6572d3735880\") " pod="openstack/ceilometer-0" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.487381 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-g2kcn"] Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.487607 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0aecc00b-0696-48f2-a28e-6572d3735880-log-httpd\") pod \"ceilometer-0\" (UID: \"0aecc00b-0696-48f2-a28e-6572d3735880\") " pod="openstack/ceilometer-0" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.487667 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84b966f6c9-g2kcn" podUID="fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e" containerName="dnsmasq-dns" containerID="cri-o://d59a3e86a73d107d2ef3e7aecd07aff4439ab508b907181c3bda693c666be0e3" gracePeriod=10 Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.489152 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/0aecc00b-0696-48f2-a28e-6572d3735880-run-httpd\") pod \"ceilometer-0\" (UID: \"0aecc00b-0696-48f2-a28e-6572d3735880\") " pod="openstack/ceilometer-0" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.492828 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0aecc00b-0696-48f2-a28e-6572d3735880-log-httpd\") pod \"ceilometer-0\" (UID: \"0aecc00b-0696-48f2-a28e-6572d3735880\") " pod="openstack/ceilometer-0" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.496009 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0aecc00b-0696-48f2-a28e-6572d3735880-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0aecc00b-0696-48f2-a28e-6572d3735880\") " pod="openstack/ceilometer-0" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.496525 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0aecc00b-0696-48f2-a28e-6572d3735880-scripts\") pod \"ceilometer-0\" (UID: \"0aecc00b-0696-48f2-a28e-6572d3735880\") " pod="openstack/ceilometer-0" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.500705 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0aecc00b-0696-48f2-a28e-6572d3735880-config-data\") pod \"ceilometer-0\" (UID: \"0aecc00b-0696-48f2-a28e-6572d3735880\") " pod="openstack/ceilometer-0" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.506361 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aecc00b-0696-48f2-a28e-6572d3735880-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0aecc00b-0696-48f2-a28e-6572d3735880\") " pod="openstack/ceilometer-0" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.516875 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-x8m82\" (UniqueName: \"kubernetes.io/projected/0aecc00b-0696-48f2-a28e-6572d3735880-kube-api-access-x8m82\") pod \"ceilometer-0\" (UID: \"0aecc00b-0696-48f2-a28e-6572d3735880\") " pod="openstack/ceilometer-0" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.620441 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.909657 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7549dccc-93b5-4e12-9cec-cc4bbde202d4" path="/var/lib/kubelet/pods/7549dccc-93b5-4e12-9cec-cc4bbde202d4/volumes" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.924452 4735 generic.go:334] "Generic (PLEG): container finished" podID="fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e" containerID="d59a3e86a73d107d2ef3e7aecd07aff4439ab508b907181c3bda693c666be0e3" exitCode=0 Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.924539 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-g2kcn" event={"ID":"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e","Type":"ContainerDied","Data":"d59a3e86a73d107d2ef3e7aecd07aff4439ab508b907181c3bda693c666be0e3"} Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.925328 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.925362 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.925373 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 01 10:34:35 crc kubenswrapper[4735]: I1001 10:34:35.925381 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 01 10:34:36 crc 
kubenswrapper[4735]: I1001 10:34:36.836781 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7fcd5cf54b-nqvjs"
Oct 01 10:34:37 crc kubenswrapper[4735]: I1001 10:34:37.534036 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7fcd5cf54b-nqvjs"
Oct 01 10:34:37 crc kubenswrapper[4735]: I1001 10:34:37.872145 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Oct 01 10:34:37 crc kubenswrapper[4735]: I1001 10:34:37.872781 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Oct 01 10:34:38 crc kubenswrapper[4735]: I1001 10:34:38.009849 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Oct 01 10:34:38 crc kubenswrapper[4735]: I1001 10:34:38.009966 4735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 01 10:34:38 crc kubenswrapper[4735]: I1001 10:34:38.549006 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Oct 01 10:34:39 crc kubenswrapper[4735]: I1001 10:34:39.748728 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7ddbc654c4-6bddt"
Oct 01 10:34:39 crc kubenswrapper[4735]: I1001 10:34:39.793172 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7ddbc654c4-6bddt"
Oct 01 10:34:39 crc kubenswrapper[4735]: I1001 10:34:39.853573 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7fcd5cf54b-nqvjs"]
Oct 01 10:34:39 crc kubenswrapper[4735]: I1001 10:34:39.853797 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7fcd5cf54b-nqvjs" podUID="a7e968c0-6535-481f-8583-7fdb64a2a42a" containerName="barbican-api-log" containerID="cri-o://bcf8cfd28bdc60525407bf68a2b587a982080d8af68eabe2dbd525ddb9676f2a" gracePeriod=30
Oct 01 10:34:39 crc kubenswrapper[4735]: I1001 10:34:39.854189 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7fcd5cf54b-nqvjs" podUID="a7e968c0-6535-481f-8583-7fdb64a2a42a" containerName="barbican-api" containerID="cri-o://7167c0f960acfac3c80a01a81a1597f8c9f2c89ad10dde96d94f58e9f5d0d2ba" gracePeriod=30
Oct 01 10:34:40 crc kubenswrapper[4735]: I1001 10:34:40.450448 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-84b966f6c9-g2kcn" podUID="fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.156:5353: connect: connection refused"
Oct 01 10:34:40 crc kubenswrapper[4735]: I1001 10:34:40.991302 4735 generic.go:334] "Generic (PLEG): container finished" podID="a7e968c0-6535-481f-8583-7fdb64a2a42a" containerID="bcf8cfd28bdc60525407bf68a2b587a982080d8af68eabe2dbd525ddb9676f2a" exitCode=143
Oct 01 10:34:40 crc kubenswrapper[4735]: I1001 10:34:40.991381 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7fcd5cf54b-nqvjs" event={"ID":"a7e968c0-6535-481f-8583-7fdb64a2a42a","Type":"ContainerDied","Data":"bcf8cfd28bdc60525407bf68a2b587a982080d8af68eabe2dbd525ddb9676f2a"}
Oct 01 10:34:42 crc kubenswrapper[4735]: I1001 10:34:42.230161 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-g2kcn"
Oct 01 10:34:42 crc kubenswrapper[4735]: I1001 10:34:42.325263 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-ovsdbserver-sb\") pod \"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e\" (UID: \"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e\") "
Oct 01 10:34:42 crc kubenswrapper[4735]: I1001 10:34:42.325337 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-dns-svc\") pod \"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e\" (UID: \"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e\") "
Oct 01 10:34:42 crc kubenswrapper[4735]: I1001 10:34:42.325394 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-dns-swift-storage-0\") pod \"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e\" (UID: \"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e\") "
Oct 01 10:34:42 crc kubenswrapper[4735]: I1001 10:34:42.325479 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-ovsdbserver-nb\") pod \"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e\" (UID: \"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e\") "
Oct 01 10:34:42 crc kubenswrapper[4735]: I1001 10:34:42.325600 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7qbh\" (UniqueName: \"kubernetes.io/projected/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-kube-api-access-r7qbh\") pod \"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e\" (UID: \"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e\") "
Oct 01 10:34:42 crc kubenswrapper[4735]: I1001 10:34:42.325715 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-config\") pod \"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e\" (UID: \"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e\") "
Oct 01 10:34:42 crc kubenswrapper[4735]: I1001 10:34:42.336358 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-kube-api-access-r7qbh" (OuterVolumeSpecName: "kube-api-access-r7qbh") pod "fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e" (UID: "fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e"). InnerVolumeSpecName "kube-api-access-r7qbh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 10:34:42 crc kubenswrapper[4735]: I1001 10:34:42.388755 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e" (UID: "fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 10:34:42 crc kubenswrapper[4735]: I1001 10:34:42.394247 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e" (UID: "fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 10:34:42 crc kubenswrapper[4735]: I1001 10:34:42.396934 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e" (UID: "fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 10:34:42 crc kubenswrapper[4735]: I1001 10:34:42.406894 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 10:34:42 crc kubenswrapper[4735]: I1001 10:34:42.408571 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-config" (OuterVolumeSpecName: "config") pod "fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e" (UID: "fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 10:34:42 crc kubenswrapper[4735]: I1001 10:34:42.424686 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e" (UID: "fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 10:34:42 crc kubenswrapper[4735]: I1001 10:34:42.427915 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7qbh\" (UniqueName: \"kubernetes.io/projected/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-kube-api-access-r7qbh\") on node \"crc\" DevicePath \"\""
Oct 01 10:34:42 crc kubenswrapper[4735]: I1001 10:34:42.427933 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-config\") on node \"crc\" DevicePath \"\""
Oct 01 10:34:42 crc kubenswrapper[4735]: I1001 10:34:42.427943 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 01 10:34:42 crc kubenswrapper[4735]: I1001 10:34:42.427951 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 01 10:34:42 crc kubenswrapper[4735]: I1001 10:34:42.427960 4735 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 01 10:34:42 crc kubenswrapper[4735]: I1001 10:34:42.427967 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 01 10:34:42 crc kubenswrapper[4735]: I1001 10:34:42.521327 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 10:34:42 crc kubenswrapper[4735]: W1001 10:34:42.542410 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0aecc00b_0696_48f2_a28e_6572d3735880.slice/crio-ff8f0675d8d095a11cf0d18560575ba0c22c042999d6e3834ea6209c6d4d71da WatchSource:0}: Error finding container ff8f0675d8d095a11cf0d18560575ba0c22c042999d6e3834ea6209c6d4d71da: Status 404 returned error can't find the container with id ff8f0675d8d095a11cf0d18560575ba0c22c042999d6e3834ea6209c6d4d71da
Oct 01 10:34:43 crc kubenswrapper[4735]: I1001 10:34:43.009791 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-x5q6q" event={"ID":"caf22975-46d3-4339-87d4-e693baddc266","Type":"ContainerStarted","Data":"17635a0cbfdcdbf14c7c896afa961ca2f921b7899ebb651337f393fd1ba050e8"}
Oct 01 10:34:43 crc kubenswrapper[4735]: I1001 10:34:43.012184 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-g2kcn" event={"ID":"fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e","Type":"ContainerDied","Data":"5144aa29f5ee863ea50838f1bd82ee74c373730beb94051d113bd4ec53f5e999"}
Oct 01 10:34:43 crc kubenswrapper[4735]: I1001 10:34:43.012218 4735 scope.go:117] "RemoveContainer" containerID="d59a3e86a73d107d2ef3e7aecd07aff4439ab508b907181c3bda693c666be0e3"
Oct 01 10:34:43 crc kubenswrapper[4735]: I1001 10:34:43.012363 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-g2kcn"
Oct 01 10:34:43 crc kubenswrapper[4735]: I1001 10:34:43.016415 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0aecc00b-0696-48f2-a28e-6572d3735880","Type":"ContainerStarted","Data":"ff8f0675d8d095a11cf0d18560575ba0c22c042999d6e3834ea6209c6d4d71da"}
Oct 01 10:34:43 crc kubenswrapper[4735]: I1001 10:34:43.028513 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7fcd5cf54b-nqvjs" podUID="a7e968c0-6535-481f-8583-7fdb64a2a42a" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.173:9311/healthcheck\": read tcp 10.217.0.2:43104->10.217.0.173:9311: read: connection reset by peer"
Oct 01 10:34:43 crc kubenswrapper[4735]: I1001 10:34:43.028611 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7fcd5cf54b-nqvjs" podUID="a7e968c0-6535-481f-8583-7fdb64a2a42a" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.173:9311/healthcheck\": read tcp 10.217.0.2:43092->10.217.0.173:9311: read: connection reset by peer"
Oct 01 10:34:43 crc kubenswrapper[4735]: I1001 10:34:43.033293 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-x5q6q" podStartSLOduration=2.238079 podStartE2EDuration="12.033275278s" podCreationTimestamp="2025-10-01 10:34:31 +0000 UTC" firstStartedPulling="2025-10-01 10:34:32.63242739 +0000 UTC m=+1031.325248652" lastFinishedPulling="2025-10-01 10:34:42.427623668 +0000 UTC m=+1041.120444930" observedRunningTime="2025-10-01 10:34:43.032283842 +0000 UTC m=+1041.725105104" watchObservedRunningTime="2025-10-01 10:34:43.033275278 +0000 UTC m=+1041.726096540"
Oct 01 10:34:43 crc kubenswrapper[4735]: I1001 10:34:43.040259 4735 scope.go:117] "RemoveContainer" containerID="38a005e749c834ab5d772a75cf1f153fadc4bd76075efd9d5da299f369be7096"
Oct 01 10:34:43 crc kubenswrapper[4735]: I1001 10:34:43.054537 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-g2kcn"]
Oct 01 10:34:43 crc kubenswrapper[4735]: I1001 10:34:43.060566 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-g2kcn"]
Oct 01 10:34:43 crc kubenswrapper[4735]: I1001 10:34:43.430978 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7fcd5cf54b-nqvjs"
Oct 01 10:34:43 crc kubenswrapper[4735]: I1001 10:34:43.444490 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hkc9\" (UniqueName: \"kubernetes.io/projected/a7e968c0-6535-481f-8583-7fdb64a2a42a-kube-api-access-8hkc9\") pod \"a7e968c0-6535-481f-8583-7fdb64a2a42a\" (UID: \"a7e968c0-6535-481f-8583-7fdb64a2a42a\") "
Oct 01 10:34:43 crc kubenswrapper[4735]: I1001 10:34:43.444536 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7e968c0-6535-481f-8583-7fdb64a2a42a-logs\") pod \"a7e968c0-6535-481f-8583-7fdb64a2a42a\" (UID: \"a7e968c0-6535-481f-8583-7fdb64a2a42a\") "
Oct 01 10:34:43 crc kubenswrapper[4735]: I1001 10:34:43.444569 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7e968c0-6535-481f-8583-7fdb64a2a42a-config-data-custom\") pod \"a7e968c0-6535-481f-8583-7fdb64a2a42a\" (UID: \"a7e968c0-6535-481f-8583-7fdb64a2a42a\") "
Oct 01 10:34:43 crc kubenswrapper[4735]: I1001 10:34:43.444616 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7e968c0-6535-481f-8583-7fdb64a2a42a-config-data\") pod \"a7e968c0-6535-481f-8583-7fdb64a2a42a\" (UID: \"a7e968c0-6535-481f-8583-7fdb64a2a42a\") "
Oct 01 10:34:43 crc kubenswrapper[4735]: I1001 10:34:43.444706 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7e968c0-6535-481f-8583-7fdb64a2a42a-combined-ca-bundle\") pod \"a7e968c0-6535-481f-8583-7fdb64a2a42a\" (UID: \"a7e968c0-6535-481f-8583-7fdb64a2a42a\") "
Oct 01 10:34:43 crc kubenswrapper[4735]: I1001 10:34:43.445153 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7e968c0-6535-481f-8583-7fdb64a2a42a-logs" (OuterVolumeSpecName: "logs") pod "a7e968c0-6535-481f-8583-7fdb64a2a42a" (UID: "a7e968c0-6535-481f-8583-7fdb64a2a42a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 10:34:43 crc kubenswrapper[4735]: I1001 10:34:43.456443 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7e968c0-6535-481f-8583-7fdb64a2a42a-kube-api-access-8hkc9" (OuterVolumeSpecName: "kube-api-access-8hkc9") pod "a7e968c0-6535-481f-8583-7fdb64a2a42a" (UID: "a7e968c0-6535-481f-8583-7fdb64a2a42a"). InnerVolumeSpecName "kube-api-access-8hkc9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 10:34:43 crc kubenswrapper[4735]: I1001 10:34:43.459151 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7e968c0-6535-481f-8583-7fdb64a2a42a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a7e968c0-6535-481f-8583-7fdb64a2a42a" (UID: "a7e968c0-6535-481f-8583-7fdb64a2a42a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:34:43 crc kubenswrapper[4735]: I1001 10:34:43.476556 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7e968c0-6535-481f-8583-7fdb64a2a42a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7e968c0-6535-481f-8583-7fdb64a2a42a" (UID: "a7e968c0-6535-481f-8583-7fdb64a2a42a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:34:43 crc kubenswrapper[4735]: I1001 10:34:43.498689 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7e968c0-6535-481f-8583-7fdb64a2a42a-config-data" (OuterVolumeSpecName: "config-data") pod "a7e968c0-6535-481f-8583-7fdb64a2a42a" (UID: "a7e968c0-6535-481f-8583-7fdb64a2a42a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:34:43 crc kubenswrapper[4735]: I1001 10:34:43.546090 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7e968c0-6535-481f-8583-7fdb64a2a42a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 10:34:43 crc kubenswrapper[4735]: I1001 10:34:43.546127 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hkc9\" (UniqueName: \"kubernetes.io/projected/a7e968c0-6535-481f-8583-7fdb64a2a42a-kube-api-access-8hkc9\") on node \"crc\" DevicePath \"\""
Oct 01 10:34:43 crc kubenswrapper[4735]: I1001 10:34:43.546142 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7e968c0-6535-481f-8583-7fdb64a2a42a-logs\") on node \"crc\" DevicePath \"\""
Oct 01 10:34:43 crc kubenswrapper[4735]: I1001 10:34:43.546153 4735 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7e968c0-6535-481f-8583-7fdb64a2a42a-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 01 10:34:43 crc kubenswrapper[4735]: I1001 10:34:43.546164 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7e968c0-6535-481f-8583-7fdb64a2a42a-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 10:34:43 crc kubenswrapper[4735]: I1001 10:34:43.911963 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e" path="/var/lib/kubelet/pods/fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e/volumes"
Oct 01 10:34:44 crc kubenswrapper[4735]: I1001 10:34:44.025924 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-84vvz" event={"ID":"61dd2f37-7f60-42f5-a3d0-3b693d1e64be","Type":"ContainerStarted","Data":"6a5fb0e85c2303b725219da07142a08eb01cd52b0fd1b3767b58c64bec46bcd0"}
Oct 01 10:34:44 crc kubenswrapper[4735]: I1001 10:34:44.029038 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0aecc00b-0696-48f2-a28e-6572d3735880","Type":"ContainerStarted","Data":"0638c7742fb6ebd3387acbdb8c39cf1f15b08f99b230fadc1648674a90171ef3"}
Oct 01 10:34:44 crc kubenswrapper[4735]: I1001 10:34:44.037963 4735 generic.go:334] "Generic (PLEG): container finished" podID="a7e968c0-6535-481f-8583-7fdb64a2a42a" containerID="7167c0f960acfac3c80a01a81a1597f8c9f2c89ad10dde96d94f58e9f5d0d2ba" exitCode=0
Oct 01 10:34:44 crc kubenswrapper[4735]: I1001 10:34:44.038402 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7fcd5cf54b-nqvjs"
Oct 01 10:34:44 crc kubenswrapper[4735]: I1001 10:34:44.038568 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7fcd5cf54b-nqvjs" event={"ID":"a7e968c0-6535-481f-8583-7fdb64a2a42a","Type":"ContainerDied","Data":"7167c0f960acfac3c80a01a81a1597f8c9f2c89ad10dde96d94f58e9f5d0d2ba"}
Oct 01 10:34:44 crc kubenswrapper[4735]: I1001 10:34:44.038707 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7fcd5cf54b-nqvjs" event={"ID":"a7e968c0-6535-481f-8583-7fdb64a2a42a","Type":"ContainerDied","Data":"98e04b75214927027e94218a19122b1e0908c9bb9a7ccc94343fbcc438458004"}
Oct 01 10:34:44 crc kubenswrapper[4735]: I1001 10:34:44.038742 4735 scope.go:117] "RemoveContainer" containerID="7167c0f960acfac3c80a01a81a1597f8c9f2c89ad10dde96d94f58e9f5d0d2ba"
Oct 01 10:34:44 crc kubenswrapper[4735]: I1001 10:34:44.046133 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-84vvz" podStartSLOduration=2.965375582 podStartE2EDuration="1m17.046110748s" podCreationTimestamp="2025-10-01 10:33:27 +0000 UTC" firstStartedPulling="2025-10-01 10:33:28.323937305 +0000 UTC m=+967.016758567" lastFinishedPulling="2025-10-01 10:34:42.404672471 +0000 UTC m=+1041.097493733" observedRunningTime="2025-10-01 10:34:44.044262578 +0000 UTC m=+1042.737083840" watchObservedRunningTime="2025-10-01 10:34:44.046110748 +0000 UTC m=+1042.738932010"
Oct 01 10:34:44 crc kubenswrapper[4735]: I1001 10:34:44.065550 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7fcd5cf54b-nqvjs"]
Oct 01 10:34:44 crc kubenswrapper[4735]: I1001 10:34:44.069070 4735 scope.go:117] "RemoveContainer" containerID="bcf8cfd28bdc60525407bf68a2b587a982080d8af68eabe2dbd525ddb9676f2a"
Oct 01 10:34:44 crc kubenswrapper[4735]: I1001 10:34:44.073694 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7fcd5cf54b-nqvjs"]
Oct 01 10:34:44 crc kubenswrapper[4735]: I1001 10:34:44.085448 4735 scope.go:117] "RemoveContainer" containerID="7167c0f960acfac3c80a01a81a1597f8c9f2c89ad10dde96d94f58e9f5d0d2ba"
Oct 01 10:34:44 crc kubenswrapper[4735]: E1001 10:34:44.085827 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7167c0f960acfac3c80a01a81a1597f8c9f2c89ad10dde96d94f58e9f5d0d2ba\": container with ID starting with 7167c0f960acfac3c80a01a81a1597f8c9f2c89ad10dde96d94f58e9f5d0d2ba not found: ID does not exist" containerID="7167c0f960acfac3c80a01a81a1597f8c9f2c89ad10dde96d94f58e9f5d0d2ba"
Oct 01 10:34:44 crc kubenswrapper[4735]: I1001 10:34:44.085877 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7167c0f960acfac3c80a01a81a1597f8c9f2c89ad10dde96d94f58e9f5d0d2ba"} err="failed to get container status \"7167c0f960acfac3c80a01a81a1597f8c9f2c89ad10dde96d94f58e9f5d0d2ba\": rpc error: code = NotFound desc = could not find container \"7167c0f960acfac3c80a01a81a1597f8c9f2c89ad10dde96d94f58e9f5d0d2ba\": container with ID starting with 7167c0f960acfac3c80a01a81a1597f8c9f2c89ad10dde96d94f58e9f5d0d2ba not found: ID does not exist"
Oct 01 10:34:44 crc kubenswrapper[4735]: I1001 10:34:44.085910 4735 scope.go:117] "RemoveContainer" containerID="bcf8cfd28bdc60525407bf68a2b587a982080d8af68eabe2dbd525ddb9676f2a"
Oct 01 10:34:44 crc kubenswrapper[4735]: E1001 10:34:44.086292 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcf8cfd28bdc60525407bf68a2b587a982080d8af68eabe2dbd525ddb9676f2a\": container with ID starting with bcf8cfd28bdc60525407bf68a2b587a982080d8af68eabe2dbd525ddb9676f2a not found: ID does not exist" containerID="bcf8cfd28bdc60525407bf68a2b587a982080d8af68eabe2dbd525ddb9676f2a"
Oct 01 10:34:44 crc kubenswrapper[4735]: I1001 10:34:44.086330 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcf8cfd28bdc60525407bf68a2b587a982080d8af68eabe2dbd525ddb9676f2a"} err="failed to get container status \"bcf8cfd28bdc60525407bf68a2b587a982080d8af68eabe2dbd525ddb9676f2a\": rpc error: code = NotFound desc = could not find container \"bcf8cfd28bdc60525407bf68a2b587a982080d8af68eabe2dbd525ddb9676f2a\": container with ID starting with bcf8cfd28bdc60525407bf68a2b587a982080d8af68eabe2dbd525ddb9676f2a not found: ID does not exist"
Oct 01 10:34:45 crc kubenswrapper[4735]: I1001 10:34:45.927954 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7e968c0-6535-481f-8583-7fdb64a2a42a" path="/var/lib/kubelet/pods/a7e968c0-6535-481f-8583-7fdb64a2a42a/volumes"
Oct 01 10:34:58 crc kubenswrapper[4735]: I1001 10:34:58.200425 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0aecc00b-0696-48f2-a28e-6572d3735880","Type":"ContainerStarted","Data":"9e5a5046f685b867f88c9c488b9ad6aec6767f319cde442e6bc659c5aed427d1"}
Oct 01 10:34:59 crc kubenswrapper[4735]: I1001 10:34:59.211009 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0aecc00b-0696-48f2-a28e-6572d3735880","Type":"ContainerStarted","Data":"d6ccc48eb4751639785a4a7fe81d81ff83129c7b17b3031d2115c300f0b98cf7"}
Oct 01 10:35:01 crc kubenswrapper[4735]: I1001 10:35:01.229162 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0aecc00b-0696-48f2-a28e-6572d3735880","Type":"ContainerStarted","Data":"4ded50d1315bb1c5af713dc362ddab0410f11c07516b85ab37df1e61580815d2"}
Oct 01 10:35:01 crc kubenswrapper[4735]: I1001 10:35:01.229687 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 01 10:35:01 crc kubenswrapper[4735]: I1001 10:35:01.229370 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0aecc00b-0696-48f2-a28e-6572d3735880" containerName="proxy-httpd" containerID="cri-o://4ded50d1315bb1c5af713dc362ddab0410f11c07516b85ab37df1e61580815d2" gracePeriod=30
Oct 01 10:35:01 crc kubenswrapper[4735]: I1001 10:35:01.229327 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0aecc00b-0696-48f2-a28e-6572d3735880" containerName="ceilometer-central-agent" containerID="cri-o://0638c7742fb6ebd3387acbdb8c39cf1f15b08f99b230fadc1648674a90171ef3" gracePeriod=30
Oct 01 10:35:01 crc kubenswrapper[4735]: I1001 10:35:01.229403 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0aecc00b-0696-48f2-a28e-6572d3735880" containerName="ceilometer-notification-agent" containerID="cri-o://9e5a5046f685b867f88c9c488b9ad6aec6767f319cde442e6bc659c5aed427d1" gracePeriod=30
Oct 01 10:35:01 crc kubenswrapper[4735]: I1001 10:35:01.229417 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0aecc00b-0696-48f2-a28e-6572d3735880" containerName="sg-core" containerID="cri-o://d6ccc48eb4751639785a4a7fe81d81ff83129c7b17b3031d2115c300f0b98cf7" gracePeriod=30
Oct 01 10:35:01 crc kubenswrapper[4735]: I1001 10:35:01.275033 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=8.653257782 podStartE2EDuration="26.275015043s" podCreationTimestamp="2025-10-01 10:34:35 +0000 UTC" firstStartedPulling="2025-10-01 10:34:42.54535992 +0000 UTC m=+1041.238181172" lastFinishedPulling="2025-10-01 10:35:00.167117171 +0000 UTC m=+1058.859938433" observedRunningTime="2025-10-01 10:35:01.269530166 +0000 UTC m=+1059.962351458" watchObservedRunningTime="2025-10-01 10:35:01.275015043 +0000 UTC m=+1059.967836325"
Oct 01 10:35:01 crc kubenswrapper[4735]: I1001 10:35:01.902374 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.006024 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aecc00b-0696-48f2-a28e-6572d3735880-combined-ca-bundle\") pod \"0aecc00b-0696-48f2-a28e-6572d3735880\" (UID: \"0aecc00b-0696-48f2-a28e-6572d3735880\") "
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.006078 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0aecc00b-0696-48f2-a28e-6572d3735880-log-httpd\") pod \"0aecc00b-0696-48f2-a28e-6572d3735880\" (UID: \"0aecc00b-0696-48f2-a28e-6572d3735880\") "
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.006174 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0aecc00b-0696-48f2-a28e-6572d3735880-config-data\") pod \"0aecc00b-0696-48f2-a28e-6572d3735880\" (UID: \"0aecc00b-0696-48f2-a28e-6572d3735880\") "
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.006255 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0aecc00b-0696-48f2-a28e-6572d3735880-scripts\") pod \"0aecc00b-0696-48f2-a28e-6572d3735880\" (UID: \"0aecc00b-0696-48f2-a28e-6572d3735880\") "
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.006283 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0aecc00b-0696-48f2-a28e-6572d3735880-run-httpd\") pod \"0aecc00b-0696-48f2-a28e-6572d3735880\" (UID: \"0aecc00b-0696-48f2-a28e-6572d3735880\") "
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.006366 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8m82\" (UniqueName: \"kubernetes.io/projected/0aecc00b-0696-48f2-a28e-6572d3735880-kube-api-access-x8m82\") pod \"0aecc00b-0696-48f2-a28e-6572d3735880\" (UID: \"0aecc00b-0696-48f2-a28e-6572d3735880\") "
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.006393 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0aecc00b-0696-48f2-a28e-6572d3735880-sg-core-conf-yaml\") pod \"0aecc00b-0696-48f2-a28e-6572d3735880\" (UID: \"0aecc00b-0696-48f2-a28e-6572d3735880\") "
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.007164 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0aecc00b-0696-48f2-a28e-6572d3735880-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0aecc00b-0696-48f2-a28e-6572d3735880" (UID: "0aecc00b-0696-48f2-a28e-6572d3735880"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.007310 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0aecc00b-0696-48f2-a28e-6572d3735880-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0aecc00b-0696-48f2-a28e-6572d3735880" (UID: "0aecc00b-0696-48f2-a28e-6572d3735880"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.011907 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aecc00b-0696-48f2-a28e-6572d3735880-scripts" (OuterVolumeSpecName: "scripts") pod "0aecc00b-0696-48f2-a28e-6572d3735880" (UID: "0aecc00b-0696-48f2-a28e-6572d3735880"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.013841 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aecc00b-0696-48f2-a28e-6572d3735880-kube-api-access-x8m82" (OuterVolumeSpecName: "kube-api-access-x8m82") pod "0aecc00b-0696-48f2-a28e-6572d3735880" (UID: "0aecc00b-0696-48f2-a28e-6572d3735880"). InnerVolumeSpecName "kube-api-access-x8m82". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.042854 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aecc00b-0696-48f2-a28e-6572d3735880-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0aecc00b-0696-48f2-a28e-6572d3735880" (UID: "0aecc00b-0696-48f2-a28e-6572d3735880"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.089260 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aecc00b-0696-48f2-a28e-6572d3735880-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0aecc00b-0696-48f2-a28e-6572d3735880" (UID: "0aecc00b-0696-48f2-a28e-6572d3735880"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.108945 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0aecc00b-0696-48f2-a28e-6572d3735880-scripts\") on node \"crc\" DevicePath \"\""
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.109121 4735 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0aecc00b-0696-48f2-a28e-6572d3735880-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.109215 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8m82\" (UniqueName: \"kubernetes.io/projected/0aecc00b-0696-48f2-a28e-6572d3735880-kube-api-access-x8m82\") on node \"crc\" DevicePath \"\""
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.109272 4735 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0aecc00b-0696-48f2-a28e-6572d3735880-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.109326 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aecc00b-0696-48f2-a28e-6572d3735880-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.109384 4735 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0aecc00b-0696-48f2-a28e-6572d3735880-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.123956 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aecc00b-0696-48f2-a28e-6572d3735880-config-data" (OuterVolumeSpecName: "config-data") pod "0aecc00b-0696-48f2-a28e-6572d3735880" (UID: "0aecc00b-0696-48f2-a28e-6572d3735880"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.210929 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0aecc00b-0696-48f2-a28e-6572d3735880-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.242577 4735 generic.go:334] "Generic (PLEG): container finished" podID="0aecc00b-0696-48f2-a28e-6572d3735880" containerID="4ded50d1315bb1c5af713dc362ddab0410f11c07516b85ab37df1e61580815d2" exitCode=0
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.242627 4735 generic.go:334] "Generic (PLEG): container finished" podID="0aecc00b-0696-48f2-a28e-6572d3735880" containerID="d6ccc48eb4751639785a4a7fe81d81ff83129c7b17b3031d2115c300f0b98cf7" exitCode=2
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.242644 4735 generic.go:334] "Generic (PLEG): container finished" podID="0aecc00b-0696-48f2-a28e-6572d3735880" containerID="9e5a5046f685b867f88c9c488b9ad6aec6767f319cde442e6bc659c5aed427d1" exitCode=0
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.242659 4735 generic.go:334] "Generic (PLEG): container finished" podID="0aecc00b-0696-48f2-a28e-6572d3735880" containerID="0638c7742fb6ebd3387acbdb8c39cf1f15b08f99b230fadc1648674a90171ef3" exitCode=0
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.242686 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0aecc00b-0696-48f2-a28e-6572d3735880","Type":"ContainerDied","Data":"4ded50d1315bb1c5af713dc362ddab0410f11c07516b85ab37df1e61580815d2"}
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.242702 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.242738 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0aecc00b-0696-48f2-a28e-6572d3735880","Type":"ContainerDied","Data":"d6ccc48eb4751639785a4a7fe81d81ff83129c7b17b3031d2115c300f0b98cf7"}
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.242752 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0aecc00b-0696-48f2-a28e-6572d3735880","Type":"ContainerDied","Data":"9e5a5046f685b867f88c9c488b9ad6aec6767f319cde442e6bc659c5aed427d1"}
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.242760 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0aecc00b-0696-48f2-a28e-6572d3735880","Type":"ContainerDied","Data":"0638c7742fb6ebd3387acbdb8c39cf1f15b08f99b230fadc1648674a90171ef3"}
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.242770 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0aecc00b-0696-48f2-a28e-6572d3735880","Type":"ContainerDied","Data":"ff8f0675d8d095a11cf0d18560575ba0c22c042999d6e3834ea6209c6d4d71da"}
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.242787 4735 scope.go:117] "RemoveContainer" containerID="4ded50d1315bb1c5af713dc362ddab0410f11c07516b85ab37df1e61580815d2"
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.269701 4735 scope.go:117] "RemoveContainer" containerID="d6ccc48eb4751639785a4a7fe81d81ff83129c7b17b3031d2115c300f0b98cf7"
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.300930 4735 scope.go:117] "RemoveContainer" containerID="9e5a5046f685b867f88c9c488b9ad6aec6767f319cde442e6bc659c5aed427d1"
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.308082 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.321306 4735
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.353583 4735 scope.go:117] "RemoveContainer" containerID="0638c7742fb6ebd3387acbdb8c39cf1f15b08f99b230fadc1648674a90171ef3" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.366479 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:35:02 crc kubenswrapper[4735]: E1001 10:35:02.373332 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e" containerName="dnsmasq-dns" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.373384 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e" containerName="dnsmasq-dns" Oct 01 10:35:02 crc kubenswrapper[4735]: E1001 10:35:02.373419 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aecc00b-0696-48f2-a28e-6572d3735880" containerName="sg-core" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.373433 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aecc00b-0696-48f2-a28e-6572d3735880" containerName="sg-core" Oct 01 10:35:02 crc kubenswrapper[4735]: E1001 10:35:02.373457 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aecc00b-0696-48f2-a28e-6572d3735880" containerName="proxy-httpd" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.373473 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aecc00b-0696-48f2-a28e-6572d3735880" containerName="proxy-httpd" Oct 01 10:35:02 crc kubenswrapper[4735]: E1001 10:35:02.373543 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7e968c0-6535-481f-8583-7fdb64a2a42a" containerName="barbican-api-log" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.373580 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7e968c0-6535-481f-8583-7fdb64a2a42a" containerName="barbican-api-log" Oct 01 10:35:02 crc 
kubenswrapper[4735]: E1001 10:35:02.373619 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aecc00b-0696-48f2-a28e-6572d3735880" containerName="ceilometer-central-agent" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.373630 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aecc00b-0696-48f2-a28e-6572d3735880" containerName="ceilometer-central-agent" Oct 01 10:35:02 crc kubenswrapper[4735]: E1001 10:35:02.373659 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7e968c0-6535-481f-8583-7fdb64a2a42a" containerName="barbican-api" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.373667 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7e968c0-6535-481f-8583-7fdb64a2a42a" containerName="barbican-api" Oct 01 10:35:02 crc kubenswrapper[4735]: E1001 10:35:02.373704 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aecc00b-0696-48f2-a28e-6572d3735880" containerName="ceilometer-notification-agent" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.373713 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aecc00b-0696-48f2-a28e-6572d3735880" containerName="ceilometer-notification-agent" Oct 01 10:35:02 crc kubenswrapper[4735]: E1001 10:35:02.373746 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e" containerName="init" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.373755 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e" containerName="init" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.374717 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7e968c0-6535-481f-8583-7fdb64a2a42a" containerName="barbican-api" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.374785 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcaa5360-5913-4ff8-b03a-62b3a6ea0c3e" containerName="dnsmasq-dns" Oct 01 
10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.374828 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7e968c0-6535-481f-8583-7fdb64a2a42a" containerName="barbican-api-log" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.374873 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aecc00b-0696-48f2-a28e-6572d3735880" containerName="ceilometer-notification-agent" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.374932 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aecc00b-0696-48f2-a28e-6572d3735880" containerName="ceilometer-central-agent" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.374951 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aecc00b-0696-48f2-a28e-6572d3735880" containerName="sg-core" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.375327 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aecc00b-0696-48f2-a28e-6572d3735880" containerName="proxy-httpd" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.380731 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.385809 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.386014 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.387710 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.429032 4735 scope.go:117] "RemoveContainer" containerID="4ded50d1315bb1c5af713dc362ddab0410f11c07516b85ab37df1e61580815d2" Oct 01 10:35:02 crc kubenswrapper[4735]: E1001 10:35:02.429440 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ded50d1315bb1c5af713dc362ddab0410f11c07516b85ab37df1e61580815d2\": container with ID starting with 4ded50d1315bb1c5af713dc362ddab0410f11c07516b85ab37df1e61580815d2 not found: ID does not exist" containerID="4ded50d1315bb1c5af713dc362ddab0410f11c07516b85ab37df1e61580815d2" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.429478 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ded50d1315bb1c5af713dc362ddab0410f11c07516b85ab37df1e61580815d2"} err="failed to get container status \"4ded50d1315bb1c5af713dc362ddab0410f11c07516b85ab37df1e61580815d2\": rpc error: code = NotFound desc = could not find container \"4ded50d1315bb1c5af713dc362ddab0410f11c07516b85ab37df1e61580815d2\": container with ID starting with 4ded50d1315bb1c5af713dc362ddab0410f11c07516b85ab37df1e61580815d2 not found: ID does not exist" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.429524 4735 scope.go:117] "RemoveContainer" containerID="d6ccc48eb4751639785a4a7fe81d81ff83129c7b17b3031d2115c300f0b98cf7" Oct 01 10:35:02 crc kubenswrapper[4735]: E1001 
10:35:02.429906 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6ccc48eb4751639785a4a7fe81d81ff83129c7b17b3031d2115c300f0b98cf7\": container with ID starting with d6ccc48eb4751639785a4a7fe81d81ff83129c7b17b3031d2115c300f0b98cf7 not found: ID does not exist" containerID="d6ccc48eb4751639785a4a7fe81d81ff83129c7b17b3031d2115c300f0b98cf7" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.429939 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6ccc48eb4751639785a4a7fe81d81ff83129c7b17b3031d2115c300f0b98cf7"} err="failed to get container status \"d6ccc48eb4751639785a4a7fe81d81ff83129c7b17b3031d2115c300f0b98cf7\": rpc error: code = NotFound desc = could not find container \"d6ccc48eb4751639785a4a7fe81d81ff83129c7b17b3031d2115c300f0b98cf7\": container with ID starting with d6ccc48eb4751639785a4a7fe81d81ff83129c7b17b3031d2115c300f0b98cf7 not found: ID does not exist" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.429961 4735 scope.go:117] "RemoveContainer" containerID="9e5a5046f685b867f88c9c488b9ad6aec6767f319cde442e6bc659c5aed427d1" Oct 01 10:35:02 crc kubenswrapper[4735]: E1001 10:35:02.430216 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e5a5046f685b867f88c9c488b9ad6aec6767f319cde442e6bc659c5aed427d1\": container with ID starting with 9e5a5046f685b867f88c9c488b9ad6aec6767f319cde442e6bc659c5aed427d1 not found: ID does not exist" containerID="9e5a5046f685b867f88c9c488b9ad6aec6767f319cde442e6bc659c5aed427d1" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.430238 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e5a5046f685b867f88c9c488b9ad6aec6767f319cde442e6bc659c5aed427d1"} err="failed to get container status \"9e5a5046f685b867f88c9c488b9ad6aec6767f319cde442e6bc659c5aed427d1\": rpc 
error: code = NotFound desc = could not find container \"9e5a5046f685b867f88c9c488b9ad6aec6767f319cde442e6bc659c5aed427d1\": container with ID starting with 9e5a5046f685b867f88c9c488b9ad6aec6767f319cde442e6bc659c5aed427d1 not found: ID does not exist" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.430255 4735 scope.go:117] "RemoveContainer" containerID="0638c7742fb6ebd3387acbdb8c39cf1f15b08f99b230fadc1648674a90171ef3" Oct 01 10:35:02 crc kubenswrapper[4735]: E1001 10:35:02.430536 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0638c7742fb6ebd3387acbdb8c39cf1f15b08f99b230fadc1648674a90171ef3\": container with ID starting with 0638c7742fb6ebd3387acbdb8c39cf1f15b08f99b230fadc1648674a90171ef3 not found: ID does not exist" containerID="0638c7742fb6ebd3387acbdb8c39cf1f15b08f99b230fadc1648674a90171ef3" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.430555 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0638c7742fb6ebd3387acbdb8c39cf1f15b08f99b230fadc1648674a90171ef3"} err="failed to get container status \"0638c7742fb6ebd3387acbdb8c39cf1f15b08f99b230fadc1648674a90171ef3\": rpc error: code = NotFound desc = could not find container \"0638c7742fb6ebd3387acbdb8c39cf1f15b08f99b230fadc1648674a90171ef3\": container with ID starting with 0638c7742fb6ebd3387acbdb8c39cf1f15b08f99b230fadc1648674a90171ef3 not found: ID does not exist" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.430567 4735 scope.go:117] "RemoveContainer" containerID="4ded50d1315bb1c5af713dc362ddab0410f11c07516b85ab37df1e61580815d2" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.430961 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ded50d1315bb1c5af713dc362ddab0410f11c07516b85ab37df1e61580815d2"} err="failed to get container status 
\"4ded50d1315bb1c5af713dc362ddab0410f11c07516b85ab37df1e61580815d2\": rpc error: code = NotFound desc = could not find container \"4ded50d1315bb1c5af713dc362ddab0410f11c07516b85ab37df1e61580815d2\": container with ID starting with 4ded50d1315bb1c5af713dc362ddab0410f11c07516b85ab37df1e61580815d2 not found: ID does not exist" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.431091 4735 scope.go:117] "RemoveContainer" containerID="d6ccc48eb4751639785a4a7fe81d81ff83129c7b17b3031d2115c300f0b98cf7" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.431681 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6ccc48eb4751639785a4a7fe81d81ff83129c7b17b3031d2115c300f0b98cf7"} err="failed to get container status \"d6ccc48eb4751639785a4a7fe81d81ff83129c7b17b3031d2115c300f0b98cf7\": rpc error: code = NotFound desc = could not find container \"d6ccc48eb4751639785a4a7fe81d81ff83129c7b17b3031d2115c300f0b98cf7\": container with ID starting with d6ccc48eb4751639785a4a7fe81d81ff83129c7b17b3031d2115c300f0b98cf7 not found: ID does not exist" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.431701 4735 scope.go:117] "RemoveContainer" containerID="9e5a5046f685b867f88c9c488b9ad6aec6767f319cde442e6bc659c5aed427d1" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.432172 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e5a5046f685b867f88c9c488b9ad6aec6767f319cde442e6bc659c5aed427d1"} err="failed to get container status \"9e5a5046f685b867f88c9c488b9ad6aec6767f319cde442e6bc659c5aed427d1\": rpc error: code = NotFound desc = could not find container \"9e5a5046f685b867f88c9c488b9ad6aec6767f319cde442e6bc659c5aed427d1\": container with ID starting with 9e5a5046f685b867f88c9c488b9ad6aec6767f319cde442e6bc659c5aed427d1 not found: ID does not exist" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.432408 4735 scope.go:117] "RemoveContainer" 
containerID="0638c7742fb6ebd3387acbdb8c39cf1f15b08f99b230fadc1648674a90171ef3" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.432782 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0638c7742fb6ebd3387acbdb8c39cf1f15b08f99b230fadc1648674a90171ef3"} err="failed to get container status \"0638c7742fb6ebd3387acbdb8c39cf1f15b08f99b230fadc1648674a90171ef3\": rpc error: code = NotFound desc = could not find container \"0638c7742fb6ebd3387acbdb8c39cf1f15b08f99b230fadc1648674a90171ef3\": container with ID starting with 0638c7742fb6ebd3387acbdb8c39cf1f15b08f99b230fadc1648674a90171ef3 not found: ID does not exist" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.432809 4735 scope.go:117] "RemoveContainer" containerID="4ded50d1315bb1c5af713dc362ddab0410f11c07516b85ab37df1e61580815d2" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.433120 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ded50d1315bb1c5af713dc362ddab0410f11c07516b85ab37df1e61580815d2"} err="failed to get container status \"4ded50d1315bb1c5af713dc362ddab0410f11c07516b85ab37df1e61580815d2\": rpc error: code = NotFound desc = could not find container \"4ded50d1315bb1c5af713dc362ddab0410f11c07516b85ab37df1e61580815d2\": container with ID starting with 4ded50d1315bb1c5af713dc362ddab0410f11c07516b85ab37df1e61580815d2 not found: ID does not exist" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.433142 4735 scope.go:117] "RemoveContainer" containerID="d6ccc48eb4751639785a4a7fe81d81ff83129c7b17b3031d2115c300f0b98cf7" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.433399 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6ccc48eb4751639785a4a7fe81d81ff83129c7b17b3031d2115c300f0b98cf7"} err="failed to get container status \"d6ccc48eb4751639785a4a7fe81d81ff83129c7b17b3031d2115c300f0b98cf7\": rpc error: code = NotFound desc = could 
not find container \"d6ccc48eb4751639785a4a7fe81d81ff83129c7b17b3031d2115c300f0b98cf7\": container with ID starting with d6ccc48eb4751639785a4a7fe81d81ff83129c7b17b3031d2115c300f0b98cf7 not found: ID does not exist" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.433422 4735 scope.go:117] "RemoveContainer" containerID="9e5a5046f685b867f88c9c488b9ad6aec6767f319cde442e6bc659c5aed427d1" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.433848 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e5a5046f685b867f88c9c488b9ad6aec6767f319cde442e6bc659c5aed427d1"} err="failed to get container status \"9e5a5046f685b867f88c9c488b9ad6aec6767f319cde442e6bc659c5aed427d1\": rpc error: code = NotFound desc = could not find container \"9e5a5046f685b867f88c9c488b9ad6aec6767f319cde442e6bc659c5aed427d1\": container with ID starting with 9e5a5046f685b867f88c9c488b9ad6aec6767f319cde442e6bc659c5aed427d1 not found: ID does not exist" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.433885 4735 scope.go:117] "RemoveContainer" containerID="0638c7742fb6ebd3387acbdb8c39cf1f15b08f99b230fadc1648674a90171ef3" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.434473 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0638c7742fb6ebd3387acbdb8c39cf1f15b08f99b230fadc1648674a90171ef3"} err="failed to get container status \"0638c7742fb6ebd3387acbdb8c39cf1f15b08f99b230fadc1648674a90171ef3\": rpc error: code = NotFound desc = could not find container \"0638c7742fb6ebd3387acbdb8c39cf1f15b08f99b230fadc1648674a90171ef3\": container with ID starting with 0638c7742fb6ebd3387acbdb8c39cf1f15b08f99b230fadc1648674a90171ef3 not found: ID does not exist" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.434526 4735 scope.go:117] "RemoveContainer" containerID="4ded50d1315bb1c5af713dc362ddab0410f11c07516b85ab37df1e61580815d2" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 
10:35:02.434844 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ded50d1315bb1c5af713dc362ddab0410f11c07516b85ab37df1e61580815d2"} err="failed to get container status \"4ded50d1315bb1c5af713dc362ddab0410f11c07516b85ab37df1e61580815d2\": rpc error: code = NotFound desc = could not find container \"4ded50d1315bb1c5af713dc362ddab0410f11c07516b85ab37df1e61580815d2\": container with ID starting with 4ded50d1315bb1c5af713dc362ddab0410f11c07516b85ab37df1e61580815d2 not found: ID does not exist" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.434881 4735 scope.go:117] "RemoveContainer" containerID="d6ccc48eb4751639785a4a7fe81d81ff83129c7b17b3031d2115c300f0b98cf7" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.435145 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6ccc48eb4751639785a4a7fe81d81ff83129c7b17b3031d2115c300f0b98cf7"} err="failed to get container status \"d6ccc48eb4751639785a4a7fe81d81ff83129c7b17b3031d2115c300f0b98cf7\": rpc error: code = NotFound desc = could not find container \"d6ccc48eb4751639785a4a7fe81d81ff83129c7b17b3031d2115c300f0b98cf7\": container with ID starting with d6ccc48eb4751639785a4a7fe81d81ff83129c7b17b3031d2115c300f0b98cf7 not found: ID does not exist" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.435178 4735 scope.go:117] "RemoveContainer" containerID="9e5a5046f685b867f88c9c488b9ad6aec6767f319cde442e6bc659c5aed427d1" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.435573 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e5a5046f685b867f88c9c488b9ad6aec6767f319cde442e6bc659c5aed427d1"} err="failed to get container status \"9e5a5046f685b867f88c9c488b9ad6aec6767f319cde442e6bc659c5aed427d1\": rpc error: code = NotFound desc = could not find container \"9e5a5046f685b867f88c9c488b9ad6aec6767f319cde442e6bc659c5aed427d1\": container with ID starting with 
9e5a5046f685b867f88c9c488b9ad6aec6767f319cde442e6bc659c5aed427d1 not found: ID does not exist" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.435654 4735 scope.go:117] "RemoveContainer" containerID="0638c7742fb6ebd3387acbdb8c39cf1f15b08f99b230fadc1648674a90171ef3" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.435928 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0638c7742fb6ebd3387acbdb8c39cf1f15b08f99b230fadc1648674a90171ef3"} err="failed to get container status \"0638c7742fb6ebd3387acbdb8c39cf1f15b08f99b230fadc1648674a90171ef3\": rpc error: code = NotFound desc = could not find container \"0638c7742fb6ebd3387acbdb8c39cf1f15b08f99b230fadc1648674a90171ef3\": container with ID starting with 0638c7742fb6ebd3387acbdb8c39cf1f15b08f99b230fadc1648674a90171ef3 not found: ID does not exist" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.514705 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/956cbf0c-5a26-476c-aed1-911c53b9f80e-config-data\") pod \"ceilometer-0\" (UID: \"956cbf0c-5a26-476c-aed1-911c53b9f80e\") " pod="openstack/ceilometer-0" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.514771 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/956cbf0c-5a26-476c-aed1-911c53b9f80e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"956cbf0c-5a26-476c-aed1-911c53b9f80e\") " pod="openstack/ceilometer-0" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.514998 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/956cbf0c-5a26-476c-aed1-911c53b9f80e-scripts\") pod \"ceilometer-0\" (UID: \"956cbf0c-5a26-476c-aed1-911c53b9f80e\") " pod="openstack/ceilometer-0" Oct 01 10:35:02 crc 
kubenswrapper[4735]: I1001 10:35:02.515209 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfdhh\" (UniqueName: \"kubernetes.io/projected/956cbf0c-5a26-476c-aed1-911c53b9f80e-kube-api-access-mfdhh\") pod \"ceilometer-0\" (UID: \"956cbf0c-5a26-476c-aed1-911c53b9f80e\") " pod="openstack/ceilometer-0" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.515368 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/956cbf0c-5a26-476c-aed1-911c53b9f80e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"956cbf0c-5a26-476c-aed1-911c53b9f80e\") " pod="openstack/ceilometer-0" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.515520 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/956cbf0c-5a26-476c-aed1-911c53b9f80e-run-httpd\") pod \"ceilometer-0\" (UID: \"956cbf0c-5a26-476c-aed1-911c53b9f80e\") " pod="openstack/ceilometer-0" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.515661 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/956cbf0c-5a26-476c-aed1-911c53b9f80e-log-httpd\") pod \"ceilometer-0\" (UID: \"956cbf0c-5a26-476c-aed1-911c53b9f80e\") " pod="openstack/ceilometer-0" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.617779 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/956cbf0c-5a26-476c-aed1-911c53b9f80e-scripts\") pod \"ceilometer-0\" (UID: \"956cbf0c-5a26-476c-aed1-911c53b9f80e\") " pod="openstack/ceilometer-0" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.617852 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfdhh\" 
(UniqueName: \"kubernetes.io/projected/956cbf0c-5a26-476c-aed1-911c53b9f80e-kube-api-access-mfdhh\") pod \"ceilometer-0\" (UID: \"956cbf0c-5a26-476c-aed1-911c53b9f80e\") " pod="openstack/ceilometer-0" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.617886 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/956cbf0c-5a26-476c-aed1-911c53b9f80e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"956cbf0c-5a26-476c-aed1-911c53b9f80e\") " pod="openstack/ceilometer-0" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.617916 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/956cbf0c-5a26-476c-aed1-911c53b9f80e-run-httpd\") pod \"ceilometer-0\" (UID: \"956cbf0c-5a26-476c-aed1-911c53b9f80e\") " pod="openstack/ceilometer-0" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.617941 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/956cbf0c-5a26-476c-aed1-911c53b9f80e-log-httpd\") pod \"ceilometer-0\" (UID: \"956cbf0c-5a26-476c-aed1-911c53b9f80e\") " pod="openstack/ceilometer-0" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.617965 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/956cbf0c-5a26-476c-aed1-911c53b9f80e-config-data\") pod \"ceilometer-0\" (UID: \"956cbf0c-5a26-476c-aed1-911c53b9f80e\") " pod="openstack/ceilometer-0" Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.617988 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/956cbf0c-5a26-476c-aed1-911c53b9f80e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"956cbf0c-5a26-476c-aed1-911c53b9f80e\") " pod="openstack/ceilometer-0" Oct 01 10:35:02 crc 
kubenswrapper[4735]: I1001 10:35:02.618883 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/956cbf0c-5a26-476c-aed1-911c53b9f80e-log-httpd\") pod \"ceilometer-0\" (UID: \"956cbf0c-5a26-476c-aed1-911c53b9f80e\") " pod="openstack/ceilometer-0"
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.619273 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/956cbf0c-5a26-476c-aed1-911c53b9f80e-run-httpd\") pod \"ceilometer-0\" (UID: \"956cbf0c-5a26-476c-aed1-911c53b9f80e\") " pod="openstack/ceilometer-0"
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.624467 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/956cbf0c-5a26-476c-aed1-911c53b9f80e-scripts\") pod \"ceilometer-0\" (UID: \"956cbf0c-5a26-476c-aed1-911c53b9f80e\") " pod="openstack/ceilometer-0"
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.629363 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/956cbf0c-5a26-476c-aed1-911c53b9f80e-config-data\") pod \"ceilometer-0\" (UID: \"956cbf0c-5a26-476c-aed1-911c53b9f80e\") " pod="openstack/ceilometer-0"
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.629542 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/956cbf0c-5a26-476c-aed1-911c53b9f80e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"956cbf0c-5a26-476c-aed1-911c53b9f80e\") " pod="openstack/ceilometer-0"
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.640424 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/956cbf0c-5a26-476c-aed1-911c53b9f80e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"956cbf0c-5a26-476c-aed1-911c53b9f80e\") " pod="openstack/ceilometer-0"
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.644978 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfdhh\" (UniqueName: \"kubernetes.io/projected/956cbf0c-5a26-476c-aed1-911c53b9f80e-kube-api-access-mfdhh\") pod \"ceilometer-0\" (UID: \"956cbf0c-5a26-476c-aed1-911c53b9f80e\") " pod="openstack/ceilometer-0"
Oct 01 10:35:02 crc kubenswrapper[4735]: I1001 10:35:02.729160 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 10:35:03 crc kubenswrapper[4735]: I1001 10:35:03.167406 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 10:35:03 crc kubenswrapper[4735]: W1001 10:35:03.171215 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod956cbf0c_5a26_476c_aed1_911c53b9f80e.slice/crio-9477c000c7da190cff68a3e396b394dbe9c0d6da3b3d8f73939240f04c37c083 WatchSource:0}: Error finding container 9477c000c7da190cff68a3e396b394dbe9c0d6da3b3d8f73939240f04c37c083: Status 404 returned error can't find the container with id 9477c000c7da190cff68a3e396b394dbe9c0d6da3b3d8f73939240f04c37c083
Oct 01 10:35:03 crc kubenswrapper[4735]: I1001 10:35:03.259117 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"956cbf0c-5a26-476c-aed1-911c53b9f80e","Type":"ContainerStarted","Data":"9477c000c7da190cff68a3e396b394dbe9c0d6da3b3d8f73939240f04c37c083"}
Oct 01 10:35:03 crc kubenswrapper[4735]: I1001 10:35:03.909820 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0aecc00b-0696-48f2-a28e-6572d3735880" path="/var/lib/kubelet/pods/0aecc00b-0696-48f2-a28e-6572d3735880/volumes"
Oct 01 10:35:04 crc kubenswrapper[4735]: I1001 10:35:04.270232 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"956cbf0c-5a26-476c-aed1-911c53b9f80e","Type":"ContainerStarted","Data":"3475b296d1f8ed559878c6a6189146bb7c39745e4e5ec08d66583d03cfce377a"}
Oct 01 10:35:05 crc kubenswrapper[4735]: I1001 10:35:05.280997 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"956cbf0c-5a26-476c-aed1-911c53b9f80e","Type":"ContainerStarted","Data":"83d4c5d3c5eaf5cf99360e86767df507abb380a7764f67223aa534912691ed64"}
Oct 01 10:35:06 crc kubenswrapper[4735]: I1001 10:35:06.299670 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"956cbf0c-5a26-476c-aed1-911c53b9f80e","Type":"ContainerStarted","Data":"aec01ea696a16ac3b1ef40870c1c8685321868f53cb861627316b750b4a81c8c"}
Oct 01 10:35:06 crc kubenswrapper[4735]: I1001 10:35:06.304185 4735 generic.go:334] "Generic (PLEG): container finished" podID="61dd2f37-7f60-42f5-a3d0-3b693d1e64be" containerID="6a5fb0e85c2303b725219da07142a08eb01cd52b0fd1b3767b58c64bec46bcd0" exitCode=0
Oct 01 10:35:06 crc kubenswrapper[4735]: I1001 10:35:06.304242 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-84vvz" event={"ID":"61dd2f37-7f60-42f5-a3d0-3b693d1e64be","Type":"ContainerDied","Data":"6a5fb0e85c2303b725219da07142a08eb01cd52b0fd1b3767b58c64bec46bcd0"}
Oct 01 10:35:07 crc kubenswrapper[4735]: I1001 10:35:07.331862 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"956cbf0c-5a26-476c-aed1-911c53b9f80e","Type":"ContainerStarted","Data":"fa19469a362b3cd23a7d346742fc24da10e3724f94d16ef47c3fdeb0c37e5563"}
Oct 01 10:35:07 crc kubenswrapper[4735]: I1001 10:35:07.332786 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 01 10:35:07 crc kubenswrapper[4735]: I1001 10:35:07.363867 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.6035329680000001 podStartE2EDuration="5.363850168s" podCreationTimestamp="2025-10-01 10:35:02 +0000 UTC" firstStartedPulling="2025-10-01 10:35:03.173331531 +0000 UTC m=+1061.866152793" lastFinishedPulling="2025-10-01 10:35:06.933648691 +0000 UTC m=+1065.626469993" observedRunningTime="2025-10-01 10:35:07.355436572 +0000 UTC m=+1066.048257834" watchObservedRunningTime="2025-10-01 10:35:07.363850168 +0000 UTC m=+1066.056671430"
Oct 01 10:35:07 crc kubenswrapper[4735]: I1001 10:35:07.734730 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-84vvz"
Oct 01 10:35:07 crc kubenswrapper[4735]: I1001 10:35:07.812864 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-db-sync-config-data\") pod \"61dd2f37-7f60-42f5-a3d0-3b693d1e64be\" (UID: \"61dd2f37-7f60-42f5-a3d0-3b693d1e64be\") "
Oct 01 10:35:07 crc kubenswrapper[4735]: I1001 10:35:07.812932 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbrm9\" (UniqueName: \"kubernetes.io/projected/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-kube-api-access-mbrm9\") pod \"61dd2f37-7f60-42f5-a3d0-3b693d1e64be\" (UID: \"61dd2f37-7f60-42f5-a3d0-3b693d1e64be\") "
Oct 01 10:35:07 crc kubenswrapper[4735]: I1001 10:35:07.812975 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-scripts\") pod \"61dd2f37-7f60-42f5-a3d0-3b693d1e64be\" (UID: \"61dd2f37-7f60-42f5-a3d0-3b693d1e64be\") "
Oct 01 10:35:07 crc kubenswrapper[4735]: I1001 10:35:07.813079 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-combined-ca-bundle\") pod \"61dd2f37-7f60-42f5-a3d0-3b693d1e64be\" (UID: \"61dd2f37-7f60-42f5-a3d0-3b693d1e64be\") "
Oct 01 10:35:07 crc kubenswrapper[4735]: I1001 10:35:07.813110 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-etc-machine-id\") pod \"61dd2f37-7f60-42f5-a3d0-3b693d1e64be\" (UID: \"61dd2f37-7f60-42f5-a3d0-3b693d1e64be\") "
Oct 01 10:35:07 crc kubenswrapper[4735]: I1001 10:35:07.813154 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-config-data\") pod \"61dd2f37-7f60-42f5-a3d0-3b693d1e64be\" (UID: \"61dd2f37-7f60-42f5-a3d0-3b693d1e64be\") "
Oct 01 10:35:07 crc kubenswrapper[4735]: I1001 10:35:07.813259 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "61dd2f37-7f60-42f5-a3d0-3b693d1e64be" (UID: "61dd2f37-7f60-42f5-a3d0-3b693d1e64be"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 01 10:35:07 crc kubenswrapper[4735]: I1001 10:35:07.813671 4735 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-etc-machine-id\") on node \"crc\" DevicePath \"\""
Oct 01 10:35:07 crc kubenswrapper[4735]: I1001 10:35:07.818420 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "61dd2f37-7f60-42f5-a3d0-3b693d1e64be" (UID: "61dd2f37-7f60-42f5-a3d0-3b693d1e64be"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:35:07 crc kubenswrapper[4735]: I1001 10:35:07.818457 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-kube-api-access-mbrm9" (OuterVolumeSpecName: "kube-api-access-mbrm9") pod "61dd2f37-7f60-42f5-a3d0-3b693d1e64be" (UID: "61dd2f37-7f60-42f5-a3d0-3b693d1e64be"). InnerVolumeSpecName "kube-api-access-mbrm9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 10:35:07 crc kubenswrapper[4735]: I1001 10:35:07.818932 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-scripts" (OuterVolumeSpecName: "scripts") pod "61dd2f37-7f60-42f5-a3d0-3b693d1e64be" (UID: "61dd2f37-7f60-42f5-a3d0-3b693d1e64be"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:35:07 crc kubenswrapper[4735]: I1001 10:35:07.845909 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61dd2f37-7f60-42f5-a3d0-3b693d1e64be" (UID: "61dd2f37-7f60-42f5-a3d0-3b693d1e64be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:35:07 crc kubenswrapper[4735]: I1001 10:35:07.875090 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-config-data" (OuterVolumeSpecName: "config-data") pod "61dd2f37-7f60-42f5-a3d0-3b693d1e64be" (UID: "61dd2f37-7f60-42f5-a3d0-3b693d1e64be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:35:07 crc kubenswrapper[4735]: I1001 10:35:07.914856 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-scripts\") on node \"crc\" DevicePath \"\""
Oct 01 10:35:07 crc kubenswrapper[4735]: I1001 10:35:07.914883 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 10:35:07 crc kubenswrapper[4735]: I1001 10:35:07.914893 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 10:35:07 crc kubenswrapper[4735]: I1001 10:35:07.914902 4735 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 10:35:07 crc kubenswrapper[4735]: I1001 10:35:07.914910 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbrm9\" (UniqueName: \"kubernetes.io/projected/61dd2f37-7f60-42f5-a3d0-3b693d1e64be-kube-api-access-mbrm9\") on node \"crc\" DevicePath \"\""
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.345793 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-84vvz" event={"ID":"61dd2f37-7f60-42f5-a3d0-3b693d1e64be","Type":"ContainerDied","Data":"79bb3c3ebe3f4ba53348cf93b6beeaec50fe3ff1506f9799886e91f5ccab8fe7"}
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.345850 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79bb3c3ebe3f4ba53348cf93b6beeaec50fe3ff1506f9799886e91f5ccab8fe7"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.345853 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-84vvz"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.650171 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 01 10:35:08 crc kubenswrapper[4735]: E1001 10:35:08.650905 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61dd2f37-7f60-42f5-a3d0-3b693d1e64be" containerName="cinder-db-sync"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.650929 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="61dd2f37-7f60-42f5-a3d0-3b693d1e64be" containerName="cinder-db-sync"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.651178 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="61dd2f37-7f60-42f5-a3d0-3b693d1e64be" containerName="cinder-db-sync"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.652390 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.658249 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.658387 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xbgnd"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.658596 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.658877 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.672742 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.735167 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpvkj\" (UniqueName: \"kubernetes.io/projected/2c1e6c13-48e7-43b8-a911-ebd0f315e447-kube-api-access-wpvkj\") pod \"cinder-scheduler-0\" (UID: \"2c1e6c13-48e7-43b8-a911-ebd0f315e447\") " pod="openstack/cinder-scheduler-0"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.735304 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c1e6c13-48e7-43b8-a911-ebd0f315e447-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2c1e6c13-48e7-43b8-a911-ebd0f315e447\") " pod="openstack/cinder-scheduler-0"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.735361 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c1e6c13-48e7-43b8-a911-ebd0f315e447-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2c1e6c13-48e7-43b8-a911-ebd0f315e447\") " pod="openstack/cinder-scheduler-0"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.735418 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c1e6c13-48e7-43b8-a911-ebd0f315e447-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2c1e6c13-48e7-43b8-a911-ebd0f315e447\") " pod="openstack/cinder-scheduler-0"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.735472 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c1e6c13-48e7-43b8-a911-ebd0f315e447-scripts\") pod \"cinder-scheduler-0\" (UID: \"2c1e6c13-48e7-43b8-a911-ebd0f315e447\") " pod="openstack/cinder-scheduler-0"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.735506 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1e6c13-48e7-43b8-a911-ebd0f315e447-config-data\") pod \"cinder-scheduler-0\" (UID: \"2c1e6c13-48e7-43b8-a911-ebd0f315e447\") " pod="openstack/cinder-scheduler-0"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.746568 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-vbx9b"]
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.747987 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-vbx9b"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.760059 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-vbx9b"]
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.837781 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/309b0308-003c-4df0-880f-3b35d5607b1c-config\") pod \"dnsmasq-dns-5784cf869f-vbx9b\" (UID: \"309b0308-003c-4df0-880f-3b35d5607b1c\") " pod="openstack/dnsmasq-dns-5784cf869f-vbx9b"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.837960 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/309b0308-003c-4df0-880f-3b35d5607b1c-dns-svc\") pod \"dnsmasq-dns-5784cf869f-vbx9b\" (UID: \"309b0308-003c-4df0-880f-3b35d5607b1c\") " pod="openstack/dnsmasq-dns-5784cf869f-vbx9b"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.838007 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c1e6c13-48e7-43b8-a911-ebd0f315e447-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2c1e6c13-48e7-43b8-a911-ebd0f315e447\") " pod="openstack/cinder-scheduler-0"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.838067 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c1e6c13-48e7-43b8-a911-ebd0f315e447-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2c1e6c13-48e7-43b8-a911-ebd0f315e447\") " pod="openstack/cinder-scheduler-0"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.838116 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/309b0308-003c-4df0-880f-3b35d5607b1c-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-vbx9b\" (UID: \"309b0308-003c-4df0-880f-3b35d5607b1c\") " pod="openstack/dnsmasq-dns-5784cf869f-vbx9b"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.838156 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/309b0308-003c-4df0-880f-3b35d5607b1c-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-vbx9b\" (UID: \"309b0308-003c-4df0-880f-3b35d5607b1c\") " pod="openstack/dnsmasq-dns-5784cf869f-vbx9b"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.838179 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/309b0308-003c-4df0-880f-3b35d5607b1c-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-vbx9b\" (UID: \"309b0308-003c-4df0-880f-3b35d5607b1c\") " pod="openstack/dnsmasq-dns-5784cf869f-vbx9b"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.838211 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c1e6c13-48e7-43b8-a911-ebd0f315e447-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2c1e6c13-48e7-43b8-a911-ebd0f315e447\") " pod="openstack/cinder-scheduler-0"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.838237 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hfqd\" (UniqueName: \"kubernetes.io/projected/309b0308-003c-4df0-880f-3b35d5607b1c-kube-api-access-9hfqd\") pod \"dnsmasq-dns-5784cf869f-vbx9b\" (UID: \"309b0308-003c-4df0-880f-3b35d5607b1c\") " pod="openstack/dnsmasq-dns-5784cf869f-vbx9b"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.838275 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c1e6c13-48e7-43b8-a911-ebd0f315e447-scripts\") pod \"cinder-scheduler-0\" (UID: \"2c1e6c13-48e7-43b8-a911-ebd0f315e447\") " pod="openstack/cinder-scheduler-0"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.838301 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1e6c13-48e7-43b8-a911-ebd0f315e447-config-data\") pod \"cinder-scheduler-0\" (UID: \"2c1e6c13-48e7-43b8-a911-ebd0f315e447\") " pod="openstack/cinder-scheduler-0"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.838374 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpvkj\" (UniqueName: \"kubernetes.io/projected/2c1e6c13-48e7-43b8-a911-ebd0f315e447-kube-api-access-wpvkj\") pod \"cinder-scheduler-0\" (UID: \"2c1e6c13-48e7-43b8-a911-ebd0f315e447\") " pod="openstack/cinder-scheduler-0"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.838564 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c1e6c13-48e7-43b8-a911-ebd0f315e447-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2c1e6c13-48e7-43b8-a911-ebd0f315e447\") " pod="openstack/cinder-scheduler-0"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.844141 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c1e6c13-48e7-43b8-a911-ebd0f315e447-scripts\") pod \"cinder-scheduler-0\" (UID: \"2c1e6c13-48e7-43b8-a911-ebd0f315e447\") " pod="openstack/cinder-scheduler-0"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.845226 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1e6c13-48e7-43b8-a911-ebd0f315e447-config-data\") pod \"cinder-scheduler-0\" (UID: \"2c1e6c13-48e7-43b8-a911-ebd0f315e447\") " pod="openstack/cinder-scheduler-0"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.849194 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c1e6c13-48e7-43b8-a911-ebd0f315e447-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2c1e6c13-48e7-43b8-a911-ebd0f315e447\") " pod="openstack/cinder-scheduler-0"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.860089 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpvkj\" (UniqueName: \"kubernetes.io/projected/2c1e6c13-48e7-43b8-a911-ebd0f315e447-kube-api-access-wpvkj\") pod \"cinder-scheduler-0\" (UID: \"2c1e6c13-48e7-43b8-a911-ebd0f315e447\") " pod="openstack/cinder-scheduler-0"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.862180 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c1e6c13-48e7-43b8-a911-ebd0f315e447-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2c1e6c13-48e7-43b8-a911-ebd0f315e447\") " pod="openstack/cinder-scheduler-0"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.941483 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/309b0308-003c-4df0-880f-3b35d5607b1c-config\") pod \"dnsmasq-dns-5784cf869f-vbx9b\" (UID: \"309b0308-003c-4df0-880f-3b35d5607b1c\") " pod="openstack/dnsmasq-dns-5784cf869f-vbx9b"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.941814 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/309b0308-003c-4df0-880f-3b35d5607b1c-dns-svc\") pod \"dnsmasq-dns-5784cf869f-vbx9b\" (UID: \"309b0308-003c-4df0-880f-3b35d5607b1c\") " pod="openstack/dnsmasq-dns-5784cf869f-vbx9b"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.941963 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/309b0308-003c-4df0-880f-3b35d5607b1c-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-vbx9b\" (UID: \"309b0308-003c-4df0-880f-3b35d5607b1c\") " pod="openstack/dnsmasq-dns-5784cf869f-vbx9b"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.942060 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/309b0308-003c-4df0-880f-3b35d5607b1c-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-vbx9b\" (UID: \"309b0308-003c-4df0-880f-3b35d5607b1c\") " pod="openstack/dnsmasq-dns-5784cf869f-vbx9b"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.942140 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/309b0308-003c-4df0-880f-3b35d5607b1c-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-vbx9b\" (UID: \"309b0308-003c-4df0-880f-3b35d5607b1c\") " pod="openstack/dnsmasq-dns-5784cf869f-vbx9b"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.942227 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hfqd\" (UniqueName: \"kubernetes.io/projected/309b0308-003c-4df0-880f-3b35d5607b1c-kube-api-access-9hfqd\") pod \"dnsmasq-dns-5784cf869f-vbx9b\" (UID: \"309b0308-003c-4df0-880f-3b35d5607b1c\") " pod="openstack/dnsmasq-dns-5784cf869f-vbx9b"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.943559 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/309b0308-003c-4df0-880f-3b35d5607b1c-dns-svc\") pod \"dnsmasq-dns-5784cf869f-vbx9b\" (UID: \"309b0308-003c-4df0-880f-3b35d5607b1c\") " pod="openstack/dnsmasq-dns-5784cf869f-vbx9b"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.944264 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/309b0308-003c-4df0-880f-3b35d5607b1c-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-vbx9b\" (UID: \"309b0308-003c-4df0-880f-3b35d5607b1c\") " pod="openstack/dnsmasq-dns-5784cf869f-vbx9b"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.944973 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/309b0308-003c-4df0-880f-3b35d5607b1c-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-vbx9b\" (UID: \"309b0308-003c-4df0-880f-3b35d5607b1c\") " pod="openstack/dnsmasq-dns-5784cf869f-vbx9b"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.945463 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/309b0308-003c-4df0-880f-3b35d5607b1c-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-vbx9b\" (UID: \"309b0308-003c-4df0-880f-3b35d5607b1c\") " pod="openstack/dnsmasq-dns-5784cf869f-vbx9b"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.945784 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/309b0308-003c-4df0-880f-3b35d5607b1c-config\") pod \"dnsmasq-dns-5784cf869f-vbx9b\" (UID: \"309b0308-003c-4df0-880f-3b35d5607b1c\") " pod="openstack/dnsmasq-dns-5784cf869f-vbx9b"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.959527 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.960993 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.964622 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.968053 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.986718 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 01 10:35:08 crc kubenswrapper[4735]: I1001 10:35:08.990011 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hfqd\" (UniqueName: \"kubernetes.io/projected/309b0308-003c-4df0-880f-3b35d5607b1c-kube-api-access-9hfqd\") pod \"dnsmasq-dns-5784cf869f-vbx9b\" (UID: \"309b0308-003c-4df0-880f-3b35d5607b1c\") " pod="openstack/dnsmasq-dns-5784cf869f-vbx9b"
Oct 01 10:35:09 crc kubenswrapper[4735]: I1001 10:35:09.054061 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac276dd4-f407-4deb-971e-16dd7ed91d3e-config-data-custom\") pod \"cinder-api-0\" (UID: \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\") " pod="openstack/cinder-api-0"
Oct 01 10:35:09 crc kubenswrapper[4735]: I1001 10:35:09.056768 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ac276dd4-f407-4deb-971e-16dd7ed91d3e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\") " pod="openstack/cinder-api-0"
Oct 01 10:35:09 crc kubenswrapper[4735]: I1001 10:35:09.056939 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac276dd4-f407-4deb-971e-16dd7ed91d3e-scripts\") pod \"cinder-api-0\" (UID: \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\") " pod="openstack/cinder-api-0"
Oct 01 10:35:09 crc kubenswrapper[4735]: I1001 10:35:09.057053 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac276dd4-f407-4deb-971e-16dd7ed91d3e-config-data\") pod \"cinder-api-0\" (UID: \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\") " pod="openstack/cinder-api-0"
Oct 01 10:35:09 crc kubenswrapper[4735]: I1001 10:35:09.057285 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29rc9\" (UniqueName: \"kubernetes.io/projected/ac276dd4-f407-4deb-971e-16dd7ed91d3e-kube-api-access-29rc9\") pod \"cinder-api-0\" (UID: \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\") " pod="openstack/cinder-api-0"
Oct 01 10:35:09 crc kubenswrapper[4735]: I1001 10:35:09.057406 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac276dd4-f407-4deb-971e-16dd7ed91d3e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\") " pod="openstack/cinder-api-0"
Oct 01 10:35:09 crc kubenswrapper[4735]: I1001 10:35:09.057551 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac276dd4-f407-4deb-971e-16dd7ed91d3e-logs\") pod \"cinder-api-0\" (UID: \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\") " pod="openstack/cinder-api-0"
Oct 01 10:35:09 crc kubenswrapper[4735]: I1001 10:35:09.065402 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-vbx9b"
Oct 01 10:35:09 crc kubenswrapper[4735]: I1001 10:35:09.160581 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29rc9\" (UniqueName: \"kubernetes.io/projected/ac276dd4-f407-4deb-971e-16dd7ed91d3e-kube-api-access-29rc9\") pod \"cinder-api-0\" (UID: \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\") " pod="openstack/cinder-api-0"
Oct 01 10:35:09 crc kubenswrapper[4735]: I1001 10:35:09.160888 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac276dd4-f407-4deb-971e-16dd7ed91d3e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\") " pod="openstack/cinder-api-0"
Oct 01 10:35:09 crc kubenswrapper[4735]: I1001 10:35:09.160920 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac276dd4-f407-4deb-971e-16dd7ed91d3e-logs\") pod \"cinder-api-0\" (UID: \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\") " pod="openstack/cinder-api-0"
Oct 01 10:35:09 crc kubenswrapper[4735]: I1001 10:35:09.160971 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac276dd4-f407-4deb-971e-16dd7ed91d3e-config-data-custom\") pod \"cinder-api-0\" (UID: \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\") " pod="openstack/cinder-api-0"
Oct 01 10:35:09 crc kubenswrapper[4735]: I1001 10:35:09.160998 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ac276dd4-f407-4deb-971e-16dd7ed91d3e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\") " pod="openstack/cinder-api-0"
Oct 01 10:35:09 crc kubenswrapper[4735]: I1001 10:35:09.161021 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac276dd4-f407-4deb-971e-16dd7ed91d3e-scripts\") pod \"cinder-api-0\" (UID: \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\") " pod="openstack/cinder-api-0"
Oct 01 10:35:09 crc kubenswrapper[4735]: I1001 10:35:09.161050 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac276dd4-f407-4deb-971e-16dd7ed91d3e-config-data\") pod \"cinder-api-0\" (UID: \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\") " pod="openstack/cinder-api-0"
Oct 01 10:35:09 crc kubenswrapper[4735]: I1001 10:35:09.162249 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac276dd4-f407-4deb-971e-16dd7ed91d3e-logs\") pod \"cinder-api-0\" (UID: \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\") " pod="openstack/cinder-api-0"
Oct 01 10:35:09 crc kubenswrapper[4735]: I1001 10:35:09.162299 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ac276dd4-f407-4deb-971e-16dd7ed91d3e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\") " pod="openstack/cinder-api-0"
Oct 01 10:35:09 crc kubenswrapper[4735]: I1001 10:35:09.165340 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac276dd4-f407-4deb-971e-16dd7ed91d3e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\") " pod="openstack/cinder-api-0"
Oct 01 10:35:09 crc kubenswrapper[4735]: I1001 10:35:09.172431 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac276dd4-f407-4deb-971e-16dd7ed91d3e-config-data-custom\") pod \"cinder-api-0\" (UID: \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\") " pod="openstack/cinder-api-0"
Oct 01 10:35:09 crc kubenswrapper[4735]: I1001 10:35:09.173535 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac276dd4-f407-4deb-971e-16dd7ed91d3e-scripts\") pod \"cinder-api-0\" (UID: \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\") " pod="openstack/cinder-api-0"
Oct 01 10:35:09 crc kubenswrapper[4735]: I1001 10:35:09.177956 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac276dd4-f407-4deb-971e-16dd7ed91d3e-config-data\") pod \"cinder-api-0\" (UID: \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\") " pod="openstack/cinder-api-0"
Oct 01 10:35:09 crc kubenswrapper[4735]: I1001 10:35:09.182170 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29rc9\" (UniqueName: \"kubernetes.io/projected/ac276dd4-f407-4deb-971e-16dd7ed91d3e-kube-api-access-29rc9\") pod \"cinder-api-0\" (UID: \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\") " pod="openstack/cinder-api-0"
Oct 01 10:35:09 crc kubenswrapper[4735]: I1001 10:35:09.229545 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 01 10:35:09 crc kubenswrapper[4735]: I1001 10:35:09.538271 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 01 10:35:09 crc kubenswrapper[4735]: I1001 10:35:09.626309 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-vbx9b"]
Oct 01 10:35:09 crc kubenswrapper[4735]: W1001 10:35:09.651126 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod309b0308_003c_4df0_880f_3b35d5607b1c.slice/crio-af9a914305e5f37c484378af83c2264b458861fd898a5cae2774c680766cee66 WatchSource:0}: Error finding container af9a914305e5f37c484378af83c2264b458861fd898a5cae2774c680766cee66: Status 404 returned error can't find the container with id af9a914305e5f37c484378af83c2264b458861fd898a5cae2774c680766cee66
Oct 01 10:35:09 crc kubenswrapper[4735]: I1001 10:35:09.731461 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Oct 01 10:35:09 crc kubenswrapper[4735]: W1001 10:35:09.765402 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac276dd4_f407_4deb_971e_16dd7ed91d3e.slice/crio-f24fe84d6b6b68064e0413414474dabe3826605a4cbc1467b607aaa5857c6393 WatchSource:0}: Error finding container f24fe84d6b6b68064e0413414474dabe3826605a4cbc1467b607aaa5857c6393: Status 404 returned error can't find the container with id f24fe84d6b6b68064e0413414474dabe3826605a4cbc1467b607aaa5857c6393
Oct 01 10:35:10 crc kubenswrapper[4735]: I1001 10:35:10.365675 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2c1e6c13-48e7-43b8-a911-ebd0f315e447","Type":"ContainerStarted","Data":"4c540ff1e335a7929798cdb1f8abaa1648409fadf86aa9343193530163ff2dbd"}
Oct 01 10:35:10 crc kubenswrapper[4735]: I1001 10:35:10.367517 4735 generic.go:334] "Generic
(PLEG): container finished" podID="309b0308-003c-4df0-880f-3b35d5607b1c" containerID="a51a28404f6679b6145737c9315e52a490a6570b56f0dae57bd561f2af63f201" exitCode=0 Oct 01 10:35:10 crc kubenswrapper[4735]: I1001 10:35:10.367626 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-vbx9b" event={"ID":"309b0308-003c-4df0-880f-3b35d5607b1c","Type":"ContainerDied","Data":"a51a28404f6679b6145737c9315e52a490a6570b56f0dae57bd561f2af63f201"} Oct 01 10:35:10 crc kubenswrapper[4735]: I1001 10:35:10.367683 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-vbx9b" event={"ID":"309b0308-003c-4df0-880f-3b35d5607b1c","Type":"ContainerStarted","Data":"af9a914305e5f37c484378af83c2264b458861fd898a5cae2774c680766cee66"} Oct 01 10:35:10 crc kubenswrapper[4735]: I1001 10:35:10.369948 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ac276dd4-f407-4deb-971e-16dd7ed91d3e","Type":"ContainerStarted","Data":"f24fe84d6b6b68064e0413414474dabe3826605a4cbc1467b607aaa5857c6393"} Oct 01 10:35:10 crc kubenswrapper[4735]: I1001 10:35:10.524252 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 01 10:35:11 crc kubenswrapper[4735]: I1001 10:35:11.380431 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ac276dd4-f407-4deb-971e-16dd7ed91d3e","Type":"ContainerStarted","Data":"fabea8f76a67f07e174bd1d0929fab03a42a68342afef24fe2ab485eb0f3a2f6"} Oct 01 10:35:11 crc kubenswrapper[4735]: I1001 10:35:11.382561 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-vbx9b" event={"ID":"309b0308-003c-4df0-880f-3b35d5607b1c","Type":"ContainerStarted","Data":"1ed2fe352e741f9f62e8767595db6f4bc8a0760fd5a653c23a4aacb8dd6bc1a7"} Oct 01 10:35:11 crc kubenswrapper[4735]: I1001 10:35:11.382957 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-5784cf869f-vbx9b" Oct 01 10:35:11 crc kubenswrapper[4735]: I1001 10:35:11.408360 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5784cf869f-vbx9b" podStartSLOduration=3.408343402 podStartE2EDuration="3.408343402s" podCreationTimestamp="2025-10-01 10:35:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:35:11.401401135 +0000 UTC m=+1070.094222397" watchObservedRunningTime="2025-10-01 10:35:11.408343402 +0000 UTC m=+1070.101164664" Oct 01 10:35:12 crc kubenswrapper[4735]: I1001 10:35:12.393674 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ac276dd4-f407-4deb-971e-16dd7ed91d3e","Type":"ContainerStarted","Data":"c5c550fa47359bdb4c50f04150b97ec02bcf392d12d916ae431f186d8830d930"} Oct 01 10:35:12 crc kubenswrapper[4735]: I1001 10:35:12.394273 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 01 10:35:12 crc kubenswrapper[4735]: I1001 10:35:12.393952 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ac276dd4-f407-4deb-971e-16dd7ed91d3e" containerName="cinder-api" containerID="cri-o://c5c550fa47359bdb4c50f04150b97ec02bcf392d12d916ae431f186d8830d930" gracePeriod=30 Oct 01 10:35:12 crc kubenswrapper[4735]: I1001 10:35:12.393844 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ac276dd4-f407-4deb-971e-16dd7ed91d3e" containerName="cinder-api-log" containerID="cri-o://fabea8f76a67f07e174bd1d0929fab03a42a68342afef24fe2ab485eb0f3a2f6" gracePeriod=30 Oct 01 10:35:12 crc kubenswrapper[4735]: I1001 10:35:12.435464 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.435440974 podStartE2EDuration="4.435440974s" 
podCreationTimestamp="2025-10-01 10:35:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:35:12.423927555 +0000 UTC m=+1071.116748817" watchObservedRunningTime="2025-10-01 10:35:12.435440974 +0000 UTC m=+1071.128262236" Oct 01 10:35:12 crc kubenswrapper[4735]: I1001 10:35:12.934704 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.045211 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac276dd4-f407-4deb-971e-16dd7ed91d3e-logs\") pod \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\" (UID: \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\") " Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.045280 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ac276dd4-f407-4deb-971e-16dd7ed91d3e-etc-machine-id\") pod \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\" (UID: \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\") " Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.045298 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac276dd4-f407-4deb-971e-16dd7ed91d3e-scripts\") pod \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\" (UID: \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\") " Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.045339 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac276dd4-f407-4deb-971e-16dd7ed91d3e-combined-ca-bundle\") pod \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\" (UID: \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\") " Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.045362 4735 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/host-path/ac276dd4-f407-4deb-971e-16dd7ed91d3e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ac276dd4-f407-4deb-971e-16dd7ed91d3e" (UID: "ac276dd4-f407-4deb-971e-16dd7ed91d3e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.045374 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac276dd4-f407-4deb-971e-16dd7ed91d3e-config-data\") pod \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\" (UID: \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\") " Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.045442 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac276dd4-f407-4deb-971e-16dd7ed91d3e-config-data-custom\") pod \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\" (UID: \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\") " Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.045569 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29rc9\" (UniqueName: \"kubernetes.io/projected/ac276dd4-f407-4deb-971e-16dd7ed91d3e-kube-api-access-29rc9\") pod \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\" (UID: \"ac276dd4-f407-4deb-971e-16dd7ed91d3e\") " Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.045723 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac276dd4-f407-4deb-971e-16dd7ed91d3e-logs" (OuterVolumeSpecName: "logs") pod "ac276dd4-f407-4deb-971e-16dd7ed91d3e" (UID: "ac276dd4-f407-4deb-971e-16dd7ed91d3e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.046009 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac276dd4-f407-4deb-971e-16dd7ed91d3e-logs\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.046035 4735 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ac276dd4-f407-4deb-971e-16dd7ed91d3e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.050657 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac276dd4-f407-4deb-971e-16dd7ed91d3e-scripts" (OuterVolumeSpecName: "scripts") pod "ac276dd4-f407-4deb-971e-16dd7ed91d3e" (UID: "ac276dd4-f407-4deb-971e-16dd7ed91d3e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.050754 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac276dd4-f407-4deb-971e-16dd7ed91d3e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ac276dd4-f407-4deb-971e-16dd7ed91d3e" (UID: "ac276dd4-f407-4deb-971e-16dd7ed91d3e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.050667 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac276dd4-f407-4deb-971e-16dd7ed91d3e-kube-api-access-29rc9" (OuterVolumeSpecName: "kube-api-access-29rc9") pod "ac276dd4-f407-4deb-971e-16dd7ed91d3e" (UID: "ac276dd4-f407-4deb-971e-16dd7ed91d3e"). InnerVolumeSpecName "kube-api-access-29rc9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.071414 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac276dd4-f407-4deb-971e-16dd7ed91d3e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac276dd4-f407-4deb-971e-16dd7ed91d3e" (UID: "ac276dd4-f407-4deb-971e-16dd7ed91d3e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.094538 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac276dd4-f407-4deb-971e-16dd7ed91d3e-config-data" (OuterVolumeSpecName: "config-data") pod "ac276dd4-f407-4deb-971e-16dd7ed91d3e" (UID: "ac276dd4-f407-4deb-971e-16dd7ed91d3e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.147651 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29rc9\" (UniqueName: \"kubernetes.io/projected/ac276dd4-f407-4deb-971e-16dd7ed91d3e-kube-api-access-29rc9\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.147687 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac276dd4-f407-4deb-971e-16dd7ed91d3e-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.147700 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac276dd4-f407-4deb-971e-16dd7ed91d3e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.147713 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac276dd4-f407-4deb-971e-16dd7ed91d3e-config-data\") on node \"crc\" DevicePath \"\"" Oct 
01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.147724 4735 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac276dd4-f407-4deb-971e-16dd7ed91d3e-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.408857 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2c1e6c13-48e7-43b8-a911-ebd0f315e447","Type":"ContainerStarted","Data":"25d044bd36825b3af9885feac14372a816d42cbe95bda754e21783b3da8b64e1"} Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.409148 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2c1e6c13-48e7-43b8-a911-ebd0f315e447","Type":"ContainerStarted","Data":"735aa55e39d2ff558bf6dc5c8cbf23d327c5f51141da53f7ecf1317872b6bc05"} Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.413369 4735 generic.go:334] "Generic (PLEG): container finished" podID="ac276dd4-f407-4deb-971e-16dd7ed91d3e" containerID="c5c550fa47359bdb4c50f04150b97ec02bcf392d12d916ae431f186d8830d930" exitCode=0 Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.413425 4735 generic.go:334] "Generic (PLEG): container finished" podID="ac276dd4-f407-4deb-971e-16dd7ed91d3e" containerID="fabea8f76a67f07e174bd1d0929fab03a42a68342afef24fe2ab485eb0f3a2f6" exitCode=143 Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.413438 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ac276dd4-f407-4deb-971e-16dd7ed91d3e","Type":"ContainerDied","Data":"c5c550fa47359bdb4c50f04150b97ec02bcf392d12d916ae431f186d8830d930"} Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.413479 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.413489 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ac276dd4-f407-4deb-971e-16dd7ed91d3e","Type":"ContainerDied","Data":"fabea8f76a67f07e174bd1d0929fab03a42a68342afef24fe2ab485eb0f3a2f6"} Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.413535 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ac276dd4-f407-4deb-971e-16dd7ed91d3e","Type":"ContainerDied","Data":"f24fe84d6b6b68064e0413414474dabe3826605a4cbc1467b607aaa5857c6393"} Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.413558 4735 scope.go:117] "RemoveContainer" containerID="c5c550fa47359bdb4c50f04150b97ec02bcf392d12d916ae431f186d8830d930" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.433099 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.175852896 podStartE2EDuration="5.433082545s" podCreationTimestamp="2025-10-01 10:35:08 +0000 UTC" firstStartedPulling="2025-10-01 10:35:09.546248778 +0000 UTC m=+1068.239070040" lastFinishedPulling="2025-10-01 10:35:11.803478427 +0000 UTC m=+1070.496299689" observedRunningTime="2025-10-01 10:35:13.430973938 +0000 UTC m=+1072.123795200" watchObservedRunningTime="2025-10-01 10:35:13.433082545 +0000 UTC m=+1072.125903807" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.456782 4735 scope.go:117] "RemoveContainer" containerID="fabea8f76a67f07e174bd1d0929fab03a42a68342afef24fe2ab485eb0f3a2f6" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.471171 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.481744 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.505711 4735 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 01 10:35:13 crc kubenswrapper[4735]: E1001 10:35:13.506166 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac276dd4-f407-4deb-971e-16dd7ed91d3e" containerName="cinder-api" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.506186 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac276dd4-f407-4deb-971e-16dd7ed91d3e" containerName="cinder-api" Oct 01 10:35:13 crc kubenswrapper[4735]: E1001 10:35:13.506204 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac276dd4-f407-4deb-971e-16dd7ed91d3e" containerName="cinder-api-log" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.506211 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac276dd4-f407-4deb-971e-16dd7ed91d3e" containerName="cinder-api-log" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.506398 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac276dd4-f407-4deb-971e-16dd7ed91d3e" containerName="cinder-api-log" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.506422 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac276dd4-f407-4deb-971e-16dd7ed91d3e" containerName="cinder-api" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.507619 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.511507 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.511812 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.512051 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.514905 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.551036 4735 scope.go:117] "RemoveContainer" containerID="c5c550fa47359bdb4c50f04150b97ec02bcf392d12d916ae431f186d8830d930" Oct 01 10:35:13 crc kubenswrapper[4735]: E1001 10:35:13.551812 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5c550fa47359bdb4c50f04150b97ec02bcf392d12d916ae431f186d8830d930\": container with ID starting with c5c550fa47359bdb4c50f04150b97ec02bcf392d12d916ae431f186d8830d930 not found: ID does not exist" containerID="c5c550fa47359bdb4c50f04150b97ec02bcf392d12d916ae431f186d8830d930" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.551853 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5c550fa47359bdb4c50f04150b97ec02bcf392d12d916ae431f186d8830d930"} err="failed to get container status \"c5c550fa47359bdb4c50f04150b97ec02bcf392d12d916ae431f186d8830d930\": rpc error: code = NotFound desc = could not find container \"c5c550fa47359bdb4c50f04150b97ec02bcf392d12d916ae431f186d8830d930\": container with ID starting with c5c550fa47359bdb4c50f04150b97ec02bcf392d12d916ae431f186d8830d930 not found: ID does not exist" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 
10:35:13.551879 4735 scope.go:117] "RemoveContainer" containerID="fabea8f76a67f07e174bd1d0929fab03a42a68342afef24fe2ab485eb0f3a2f6" Oct 01 10:35:13 crc kubenswrapper[4735]: E1001 10:35:13.552217 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fabea8f76a67f07e174bd1d0929fab03a42a68342afef24fe2ab485eb0f3a2f6\": container with ID starting with fabea8f76a67f07e174bd1d0929fab03a42a68342afef24fe2ab485eb0f3a2f6 not found: ID does not exist" containerID="fabea8f76a67f07e174bd1d0929fab03a42a68342afef24fe2ab485eb0f3a2f6" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.552250 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fabea8f76a67f07e174bd1d0929fab03a42a68342afef24fe2ab485eb0f3a2f6"} err="failed to get container status \"fabea8f76a67f07e174bd1d0929fab03a42a68342afef24fe2ab485eb0f3a2f6\": rpc error: code = NotFound desc = could not find container \"fabea8f76a67f07e174bd1d0929fab03a42a68342afef24fe2ab485eb0f3a2f6\": container with ID starting with fabea8f76a67f07e174bd1d0929fab03a42a68342afef24fe2ab485eb0f3a2f6 not found: ID does not exist" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.552273 4735 scope.go:117] "RemoveContainer" containerID="c5c550fa47359bdb4c50f04150b97ec02bcf392d12d916ae431f186d8830d930" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.552550 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5c550fa47359bdb4c50f04150b97ec02bcf392d12d916ae431f186d8830d930"} err="failed to get container status \"c5c550fa47359bdb4c50f04150b97ec02bcf392d12d916ae431f186d8830d930\": rpc error: code = NotFound desc = could not find container \"c5c550fa47359bdb4c50f04150b97ec02bcf392d12d916ae431f186d8830d930\": container with ID starting with c5c550fa47359bdb4c50f04150b97ec02bcf392d12d916ae431f186d8830d930 not found: ID does not exist" Oct 01 10:35:13 crc 
kubenswrapper[4735]: I1001 10:35:13.552654 4735 scope.go:117] "RemoveContainer" containerID="fabea8f76a67f07e174bd1d0929fab03a42a68342afef24fe2ab485eb0f3a2f6" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.553222 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fabea8f76a67f07e174bd1d0929fab03a42a68342afef24fe2ab485eb0f3a2f6"} err="failed to get container status \"fabea8f76a67f07e174bd1d0929fab03a42a68342afef24fe2ab485eb0f3a2f6\": rpc error: code = NotFound desc = could not find container \"fabea8f76a67f07e174bd1d0929fab03a42a68342afef24fe2ab485eb0f3a2f6\": container with ID starting with fabea8f76a67f07e174bd1d0929fab03a42a68342afef24fe2ab485eb0f3a2f6 not found: ID does not exist" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.659106 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a00c66d0-b381-43e2-ae45-8635ff4f424e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a00c66d0-b381-43e2-ae45-8635ff4f424e\") " pod="openstack/cinder-api-0" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.659174 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a00c66d0-b381-43e2-ae45-8635ff4f424e-logs\") pod \"cinder-api-0\" (UID: \"a00c66d0-b381-43e2-ae45-8635ff4f424e\") " pod="openstack/cinder-api-0" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.659400 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a00c66d0-b381-43e2-ae45-8635ff4f424e-scripts\") pod \"cinder-api-0\" (UID: \"a00c66d0-b381-43e2-ae45-8635ff4f424e\") " pod="openstack/cinder-api-0" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.659577 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a00c66d0-b381-43e2-ae45-8635ff4f424e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a00c66d0-b381-43e2-ae45-8635ff4f424e\") " pod="openstack/cinder-api-0" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.659616 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a00c66d0-b381-43e2-ae45-8635ff4f424e-config-data\") pod \"cinder-api-0\" (UID: \"a00c66d0-b381-43e2-ae45-8635ff4f424e\") " pod="openstack/cinder-api-0" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.659652 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hvck\" (UniqueName: \"kubernetes.io/projected/a00c66d0-b381-43e2-ae45-8635ff4f424e-kube-api-access-9hvck\") pod \"cinder-api-0\" (UID: \"a00c66d0-b381-43e2-ae45-8635ff4f424e\") " pod="openstack/cinder-api-0" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.659736 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a00c66d0-b381-43e2-ae45-8635ff4f424e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a00c66d0-b381-43e2-ae45-8635ff4f424e\") " pod="openstack/cinder-api-0" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.659779 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a00c66d0-b381-43e2-ae45-8635ff4f424e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a00c66d0-b381-43e2-ae45-8635ff4f424e\") " pod="openstack/cinder-api-0" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.659925 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/a00c66d0-b381-43e2-ae45-8635ff4f424e-config-data-custom\") pod \"cinder-api-0\" (UID: \"a00c66d0-b381-43e2-ae45-8635ff4f424e\") " pod="openstack/cinder-api-0" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.761955 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a00c66d0-b381-43e2-ae45-8635ff4f424e-logs\") pod \"cinder-api-0\" (UID: \"a00c66d0-b381-43e2-ae45-8635ff4f424e\") " pod="openstack/cinder-api-0" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.762083 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a00c66d0-b381-43e2-ae45-8635ff4f424e-scripts\") pod \"cinder-api-0\" (UID: \"a00c66d0-b381-43e2-ae45-8635ff4f424e\") " pod="openstack/cinder-api-0" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.762132 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a00c66d0-b381-43e2-ae45-8635ff4f424e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a00c66d0-b381-43e2-ae45-8635ff4f424e\") " pod="openstack/cinder-api-0" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.762156 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a00c66d0-b381-43e2-ae45-8635ff4f424e-config-data\") pod \"cinder-api-0\" (UID: \"a00c66d0-b381-43e2-ae45-8635ff4f424e\") " pod="openstack/cinder-api-0" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.762203 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hvck\" (UniqueName: \"kubernetes.io/projected/a00c66d0-b381-43e2-ae45-8635ff4f424e-kube-api-access-9hvck\") pod \"cinder-api-0\" (UID: \"a00c66d0-b381-43e2-ae45-8635ff4f424e\") " pod="openstack/cinder-api-0" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.762232 
4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a00c66d0-b381-43e2-ae45-8635ff4f424e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a00c66d0-b381-43e2-ae45-8635ff4f424e\") " pod="openstack/cinder-api-0" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.762236 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a00c66d0-b381-43e2-ae45-8635ff4f424e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a00c66d0-b381-43e2-ae45-8635ff4f424e\") " pod="openstack/cinder-api-0" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.762325 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a00c66d0-b381-43e2-ae45-8635ff4f424e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a00c66d0-b381-43e2-ae45-8635ff4f424e\") " pod="openstack/cinder-api-0" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.762412 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a00c66d0-b381-43e2-ae45-8635ff4f424e-config-data-custom\") pod \"cinder-api-0\" (UID: \"a00c66d0-b381-43e2-ae45-8635ff4f424e\") " pod="openstack/cinder-api-0" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.762442 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a00c66d0-b381-43e2-ae45-8635ff4f424e-logs\") pod \"cinder-api-0\" (UID: \"a00c66d0-b381-43e2-ae45-8635ff4f424e\") " pod="openstack/cinder-api-0" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.762645 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a00c66d0-b381-43e2-ae45-8635ff4f424e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: 
\"a00c66d0-b381-43e2-ae45-8635ff4f424e\") " pod="openstack/cinder-api-0" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.766930 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a00c66d0-b381-43e2-ae45-8635ff4f424e-config-data-custom\") pod \"cinder-api-0\" (UID: \"a00c66d0-b381-43e2-ae45-8635ff4f424e\") " pod="openstack/cinder-api-0" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.768169 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a00c66d0-b381-43e2-ae45-8635ff4f424e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a00c66d0-b381-43e2-ae45-8635ff4f424e\") " pod="openstack/cinder-api-0" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.768971 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a00c66d0-b381-43e2-ae45-8635ff4f424e-config-data\") pod \"cinder-api-0\" (UID: \"a00c66d0-b381-43e2-ae45-8635ff4f424e\") " pod="openstack/cinder-api-0" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.769635 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a00c66d0-b381-43e2-ae45-8635ff4f424e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a00c66d0-b381-43e2-ae45-8635ff4f424e\") " pod="openstack/cinder-api-0" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.771278 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a00c66d0-b381-43e2-ae45-8635ff4f424e-scripts\") pod \"cinder-api-0\" (UID: \"a00c66d0-b381-43e2-ae45-8635ff4f424e\") " pod="openstack/cinder-api-0" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.772384 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a00c66d0-b381-43e2-ae45-8635ff4f424e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a00c66d0-b381-43e2-ae45-8635ff4f424e\") " pod="openstack/cinder-api-0" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.786402 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hvck\" (UniqueName: \"kubernetes.io/projected/a00c66d0-b381-43e2-ae45-8635ff4f424e-kube-api-access-9hvck\") pod \"cinder-api-0\" (UID: \"a00c66d0-b381-43e2-ae45-8635ff4f424e\") " pod="openstack/cinder-api-0" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.854197 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.912198 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac276dd4-f407-4deb-971e-16dd7ed91d3e" path="/var/lib/kubelet/pods/ac276dd4-f407-4deb-971e-16dd7ed91d3e/volumes" Oct 01 10:35:13 crc kubenswrapper[4735]: I1001 10:35:13.988512 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 01 10:35:14 crc kubenswrapper[4735]: I1001 10:35:14.304204 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 01 10:35:14 crc kubenswrapper[4735]: W1001 10:35:14.310587 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda00c66d0_b381_43e2_ae45_8635ff4f424e.slice/crio-4405dc2e587de3bae97b5f573cc791bdd7143365029bfd2f2221da2a70b1ecf4 WatchSource:0}: Error finding container 4405dc2e587de3bae97b5f573cc791bdd7143365029bfd2f2221da2a70b1ecf4: Status 404 returned error can't find the container with id 4405dc2e587de3bae97b5f573cc791bdd7143365029bfd2f2221da2a70b1ecf4 Oct 01 10:35:14 crc kubenswrapper[4735]: I1001 10:35:14.425031 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"a00c66d0-b381-43e2-ae45-8635ff4f424e","Type":"ContainerStarted","Data":"4405dc2e587de3bae97b5f573cc791bdd7143365029bfd2f2221da2a70b1ecf4"} Oct 01 10:35:15 crc kubenswrapper[4735]: I1001 10:35:15.442202 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a00c66d0-b381-43e2-ae45-8635ff4f424e","Type":"ContainerStarted","Data":"175b47b82898d551afea1ebe786be0c75200aa60b428140cf441146ffebd1ed3"} Oct 01 10:35:16 crc kubenswrapper[4735]: I1001 10:35:16.452771 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a00c66d0-b381-43e2-ae45-8635ff4f424e","Type":"ContainerStarted","Data":"eee018206d9701a06d45e7e3a2724e7946ac17cc3bf21f13e0cdeabab7b8ff14"} Oct 01 10:35:16 crc kubenswrapper[4735]: I1001 10:35:16.453357 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 01 10:35:16 crc kubenswrapper[4735]: I1001 10:35:16.483857 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.483839013 podStartE2EDuration="3.483839013s" podCreationTimestamp="2025-10-01 10:35:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:35:16.479170766 +0000 UTC m=+1075.171992028" watchObservedRunningTime="2025-10-01 10:35:16.483839013 +0000 UTC m=+1075.176660275" Oct 01 10:35:19 crc kubenswrapper[4735]: I1001 10:35:19.066712 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5784cf869f-vbx9b" Oct 01 10:35:19 crc kubenswrapper[4735]: I1001 10:35:19.143278 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-667b6"] Oct 01 10:35:19 crc kubenswrapper[4735]: I1001 10:35:19.143636 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75c8ddd69c-667b6" 
podUID="3fadfe66-81a6-4af1-9d8e-a65139585a11" containerName="dnsmasq-dns" containerID="cri-o://61957b35911e804bfe5137cd82eebbcc4d6cdd566ab0e5a91bf0845ef28eb84b" gracePeriod=10 Oct 01 10:35:19 crc kubenswrapper[4735]: I1001 10:35:19.263916 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 01 10:35:19 crc kubenswrapper[4735]: I1001 10:35:19.310081 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 10:35:19 crc kubenswrapper[4735]: I1001 10:35:19.497619 4735 generic.go:334] "Generic (PLEG): container finished" podID="3fadfe66-81a6-4af1-9d8e-a65139585a11" containerID="61957b35911e804bfe5137cd82eebbcc4d6cdd566ab0e5a91bf0845ef28eb84b" exitCode=0 Oct 01 10:35:19 crc kubenswrapper[4735]: I1001 10:35:19.497928 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="2c1e6c13-48e7-43b8-a911-ebd0f315e447" containerName="cinder-scheduler" containerID="cri-o://735aa55e39d2ff558bf6dc5c8cbf23d327c5f51141da53f7ecf1317872b6bc05" gracePeriod=30 Oct 01 10:35:19 crc kubenswrapper[4735]: I1001 10:35:19.498334 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-667b6" event={"ID":"3fadfe66-81a6-4af1-9d8e-a65139585a11","Type":"ContainerDied","Data":"61957b35911e804bfe5137cd82eebbcc4d6cdd566ab0e5a91bf0845ef28eb84b"} Oct 01 10:35:19 crc kubenswrapper[4735]: I1001 10:35:19.498730 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="2c1e6c13-48e7-43b8-a911-ebd0f315e447" containerName="probe" containerID="cri-o://25d044bd36825b3af9885feac14372a816d42cbe95bda754e21783b3da8b64e1" gracePeriod=30 Oct 01 10:35:19 crc kubenswrapper[4735]: I1001 10:35:19.712169 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-667b6" Oct 01 10:35:19 crc kubenswrapper[4735]: I1001 10:35:19.877173 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3fadfe66-81a6-4af1-9d8e-a65139585a11-ovsdbserver-sb\") pod \"3fadfe66-81a6-4af1-9d8e-a65139585a11\" (UID: \"3fadfe66-81a6-4af1-9d8e-a65139585a11\") " Oct 01 10:35:19 crc kubenswrapper[4735]: I1001 10:35:19.877246 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqs4z\" (UniqueName: \"kubernetes.io/projected/3fadfe66-81a6-4af1-9d8e-a65139585a11-kube-api-access-fqs4z\") pod \"3fadfe66-81a6-4af1-9d8e-a65139585a11\" (UID: \"3fadfe66-81a6-4af1-9d8e-a65139585a11\") " Oct 01 10:35:19 crc kubenswrapper[4735]: I1001 10:35:19.877288 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3fadfe66-81a6-4af1-9d8e-a65139585a11-ovsdbserver-nb\") pod \"3fadfe66-81a6-4af1-9d8e-a65139585a11\" (UID: \"3fadfe66-81a6-4af1-9d8e-a65139585a11\") " Oct 01 10:35:19 crc kubenswrapper[4735]: I1001 10:35:19.877327 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3fadfe66-81a6-4af1-9d8e-a65139585a11-dns-swift-storage-0\") pod \"3fadfe66-81a6-4af1-9d8e-a65139585a11\" (UID: \"3fadfe66-81a6-4af1-9d8e-a65139585a11\") " Oct 01 10:35:19 crc kubenswrapper[4735]: I1001 10:35:19.877405 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3fadfe66-81a6-4af1-9d8e-a65139585a11-dns-svc\") pod \"3fadfe66-81a6-4af1-9d8e-a65139585a11\" (UID: \"3fadfe66-81a6-4af1-9d8e-a65139585a11\") " Oct 01 10:35:19 crc kubenswrapper[4735]: I1001 10:35:19.877456 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/3fadfe66-81a6-4af1-9d8e-a65139585a11-config\") pod \"3fadfe66-81a6-4af1-9d8e-a65139585a11\" (UID: \"3fadfe66-81a6-4af1-9d8e-a65139585a11\") " Oct 01 10:35:19 crc kubenswrapper[4735]: I1001 10:35:19.889487 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fadfe66-81a6-4af1-9d8e-a65139585a11-kube-api-access-fqs4z" (OuterVolumeSpecName: "kube-api-access-fqs4z") pod "3fadfe66-81a6-4af1-9d8e-a65139585a11" (UID: "3fadfe66-81a6-4af1-9d8e-a65139585a11"). InnerVolumeSpecName "kube-api-access-fqs4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:35:19 crc kubenswrapper[4735]: I1001 10:35:19.940190 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fadfe66-81a6-4af1-9d8e-a65139585a11-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3fadfe66-81a6-4af1-9d8e-a65139585a11" (UID: "3fadfe66-81a6-4af1-9d8e-a65139585a11"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:35:19 crc kubenswrapper[4735]: I1001 10:35:19.954387 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fadfe66-81a6-4af1-9d8e-a65139585a11-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3fadfe66-81a6-4af1-9d8e-a65139585a11" (UID: "3fadfe66-81a6-4af1-9d8e-a65139585a11"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:35:19 crc kubenswrapper[4735]: I1001 10:35:19.954931 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fadfe66-81a6-4af1-9d8e-a65139585a11-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3fadfe66-81a6-4af1-9d8e-a65139585a11" (UID: "3fadfe66-81a6-4af1-9d8e-a65139585a11"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:35:19 crc kubenswrapper[4735]: I1001 10:35:19.963973 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fadfe66-81a6-4af1-9d8e-a65139585a11-config" (OuterVolumeSpecName: "config") pod "3fadfe66-81a6-4af1-9d8e-a65139585a11" (UID: "3fadfe66-81a6-4af1-9d8e-a65139585a11"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:35:19 crc kubenswrapper[4735]: I1001 10:35:19.971106 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fadfe66-81a6-4af1-9d8e-a65139585a11-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3fadfe66-81a6-4af1-9d8e-a65139585a11" (UID: "3fadfe66-81a6-4af1-9d8e-a65139585a11"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:35:19 crc kubenswrapper[4735]: I1001 10:35:19.979397 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3fadfe66-81a6-4af1-9d8e-a65139585a11-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:19 crc kubenswrapper[4735]: I1001 10:35:19.979457 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fadfe66-81a6-4af1-9d8e-a65139585a11-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:19 crc kubenswrapper[4735]: I1001 10:35:19.979469 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3fadfe66-81a6-4af1-9d8e-a65139585a11-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:19 crc kubenswrapper[4735]: I1001 10:35:19.979486 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqs4z\" (UniqueName: \"kubernetes.io/projected/3fadfe66-81a6-4af1-9d8e-a65139585a11-kube-api-access-fqs4z\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:19 crc kubenswrapper[4735]: I1001 
10:35:19.979517 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3fadfe66-81a6-4af1-9d8e-a65139585a11-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:19 crc kubenswrapper[4735]: I1001 10:35:19.979528 4735 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3fadfe66-81a6-4af1-9d8e-a65139585a11-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:20 crc kubenswrapper[4735]: I1001 10:35:20.508564 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-667b6" event={"ID":"3fadfe66-81a6-4af1-9d8e-a65139585a11","Type":"ContainerDied","Data":"6f68f1afbbddf759ddd5a1a81dd0bfb6df94b03c87af2b7d2620539e9aa7de2c"} Oct 01 10:35:20 crc kubenswrapper[4735]: I1001 10:35:20.508588 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-667b6" Oct 01 10:35:20 crc kubenswrapper[4735]: I1001 10:35:20.508940 4735 scope.go:117] "RemoveContainer" containerID="61957b35911e804bfe5137cd82eebbcc4d6cdd566ab0e5a91bf0845ef28eb84b" Oct 01 10:35:20 crc kubenswrapper[4735]: I1001 10:35:20.510650 4735 generic.go:334] "Generic (PLEG): container finished" podID="2c1e6c13-48e7-43b8-a911-ebd0f315e447" containerID="25d044bd36825b3af9885feac14372a816d42cbe95bda754e21783b3da8b64e1" exitCode=0 Oct 01 10:35:20 crc kubenswrapper[4735]: I1001 10:35:20.510685 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2c1e6c13-48e7-43b8-a911-ebd0f315e447","Type":"ContainerDied","Data":"25d044bd36825b3af9885feac14372a816d42cbe95bda754e21783b3da8b64e1"} Oct 01 10:35:20 crc kubenswrapper[4735]: I1001 10:35:20.546245 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-667b6"] Oct 01 10:35:20 crc kubenswrapper[4735]: I1001 10:35:20.546384 4735 scope.go:117] 
"RemoveContainer" containerID="1570d784f61d2fa72addb295b30cfeb3a8da22a170e7aa616a9028ac85f28bdd" Oct 01 10:35:20 crc kubenswrapper[4735]: I1001 10:35:20.562668 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-667b6"] Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.386273 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.502676 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1e6c13-48e7-43b8-a911-ebd0f315e447-config-data\") pod \"2c1e6c13-48e7-43b8-a911-ebd0f315e447\" (UID: \"2c1e6c13-48e7-43b8-a911-ebd0f315e447\") " Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.502760 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c1e6c13-48e7-43b8-a911-ebd0f315e447-config-data-custom\") pod \"2c1e6c13-48e7-43b8-a911-ebd0f315e447\" (UID: \"2c1e6c13-48e7-43b8-a911-ebd0f315e447\") " Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.502780 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c1e6c13-48e7-43b8-a911-ebd0f315e447-etc-machine-id\") pod \"2c1e6c13-48e7-43b8-a911-ebd0f315e447\" (UID: \"2c1e6c13-48e7-43b8-a911-ebd0f315e447\") " Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.502903 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpvkj\" (UniqueName: \"kubernetes.io/projected/2c1e6c13-48e7-43b8-a911-ebd0f315e447-kube-api-access-wpvkj\") pod \"2c1e6c13-48e7-43b8-a911-ebd0f315e447\" (UID: \"2c1e6c13-48e7-43b8-a911-ebd0f315e447\") " Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.502951 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c1e6c13-48e7-43b8-a911-ebd0f315e447-scripts\") pod \"2c1e6c13-48e7-43b8-a911-ebd0f315e447\" (UID: \"2c1e6c13-48e7-43b8-a911-ebd0f315e447\") " Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.502990 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c1e6c13-48e7-43b8-a911-ebd0f315e447-combined-ca-bundle\") pod \"2c1e6c13-48e7-43b8-a911-ebd0f315e447\" (UID: \"2c1e6c13-48e7-43b8-a911-ebd0f315e447\") " Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.503206 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c1e6c13-48e7-43b8-a911-ebd0f315e447-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2c1e6c13-48e7-43b8-a911-ebd0f315e447" (UID: "2c1e6c13-48e7-43b8-a911-ebd0f315e447"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.503611 4735 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2c1e6c13-48e7-43b8-a911-ebd0f315e447-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.508851 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c1e6c13-48e7-43b8-a911-ebd0f315e447-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2c1e6c13-48e7-43b8-a911-ebd0f315e447" (UID: "2c1e6c13-48e7-43b8-a911-ebd0f315e447"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.509744 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c1e6c13-48e7-43b8-a911-ebd0f315e447-kube-api-access-wpvkj" (OuterVolumeSpecName: "kube-api-access-wpvkj") pod "2c1e6c13-48e7-43b8-a911-ebd0f315e447" (UID: "2c1e6c13-48e7-43b8-a911-ebd0f315e447"). InnerVolumeSpecName "kube-api-access-wpvkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.514054 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c1e6c13-48e7-43b8-a911-ebd0f315e447-scripts" (OuterVolumeSpecName: "scripts") pod "2c1e6c13-48e7-43b8-a911-ebd0f315e447" (UID: "2c1e6c13-48e7-43b8-a911-ebd0f315e447"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.524911 4735 generic.go:334] "Generic (PLEG): container finished" podID="2c1e6c13-48e7-43b8-a911-ebd0f315e447" containerID="735aa55e39d2ff558bf6dc5c8cbf23d327c5f51141da53f7ecf1317872b6bc05" exitCode=0 Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.524952 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2c1e6c13-48e7-43b8-a911-ebd0f315e447","Type":"ContainerDied","Data":"735aa55e39d2ff558bf6dc5c8cbf23d327c5f51141da53f7ecf1317872b6bc05"} Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.524978 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2c1e6c13-48e7-43b8-a911-ebd0f315e447","Type":"ContainerDied","Data":"4c540ff1e335a7929798cdb1f8abaa1648409fadf86aa9343193530163ff2dbd"} Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.524995 4735 scope.go:117] "RemoveContainer" containerID="25d044bd36825b3af9885feac14372a816d42cbe95bda754e21783b3da8b64e1" Oct 01 10:35:21 crc 
kubenswrapper[4735]: I1001 10:35:21.525091 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.572382 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c1e6c13-48e7-43b8-a911-ebd0f315e447-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c1e6c13-48e7-43b8-a911-ebd0f315e447" (UID: "2c1e6c13-48e7-43b8-a911-ebd0f315e447"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.605340 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpvkj\" (UniqueName: \"kubernetes.io/projected/2c1e6c13-48e7-43b8-a911-ebd0f315e447-kube-api-access-wpvkj\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.605375 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c1e6c13-48e7-43b8-a911-ebd0f315e447-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.605384 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c1e6c13-48e7-43b8-a911-ebd0f315e447-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.605393 4735 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c1e6c13-48e7-43b8-a911-ebd0f315e447-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.636255 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c1e6c13-48e7-43b8-a911-ebd0f315e447-config-data" (OuterVolumeSpecName: "config-data") pod "2c1e6c13-48e7-43b8-a911-ebd0f315e447" (UID: 
"2c1e6c13-48e7-43b8-a911-ebd0f315e447"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.653215 4735 scope.go:117] "RemoveContainer" containerID="735aa55e39d2ff558bf6dc5c8cbf23d327c5f51141da53f7ecf1317872b6bc05" Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.673275 4735 scope.go:117] "RemoveContainer" containerID="25d044bd36825b3af9885feac14372a816d42cbe95bda754e21783b3da8b64e1" Oct 01 10:35:21 crc kubenswrapper[4735]: E1001 10:35:21.673835 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25d044bd36825b3af9885feac14372a816d42cbe95bda754e21783b3da8b64e1\": container with ID starting with 25d044bd36825b3af9885feac14372a816d42cbe95bda754e21783b3da8b64e1 not found: ID does not exist" containerID="25d044bd36825b3af9885feac14372a816d42cbe95bda754e21783b3da8b64e1" Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.673887 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25d044bd36825b3af9885feac14372a816d42cbe95bda754e21783b3da8b64e1"} err="failed to get container status \"25d044bd36825b3af9885feac14372a816d42cbe95bda754e21783b3da8b64e1\": rpc error: code = NotFound desc = could not find container \"25d044bd36825b3af9885feac14372a816d42cbe95bda754e21783b3da8b64e1\": container with ID starting with 25d044bd36825b3af9885feac14372a816d42cbe95bda754e21783b3da8b64e1 not found: ID does not exist" Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.673912 4735 scope.go:117] "RemoveContainer" containerID="735aa55e39d2ff558bf6dc5c8cbf23d327c5f51141da53f7ecf1317872b6bc05" Oct 01 10:35:21 crc kubenswrapper[4735]: E1001 10:35:21.674200 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"735aa55e39d2ff558bf6dc5c8cbf23d327c5f51141da53f7ecf1317872b6bc05\": container with 
ID starting with 735aa55e39d2ff558bf6dc5c8cbf23d327c5f51141da53f7ecf1317872b6bc05 not found: ID does not exist" containerID="735aa55e39d2ff558bf6dc5c8cbf23d327c5f51141da53f7ecf1317872b6bc05" Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.674230 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"735aa55e39d2ff558bf6dc5c8cbf23d327c5f51141da53f7ecf1317872b6bc05"} err="failed to get container status \"735aa55e39d2ff558bf6dc5c8cbf23d327c5f51141da53f7ecf1317872b6bc05\": rpc error: code = NotFound desc = could not find container \"735aa55e39d2ff558bf6dc5c8cbf23d327c5f51141da53f7ecf1317872b6bc05\": container with ID starting with 735aa55e39d2ff558bf6dc5c8cbf23d327c5f51141da53f7ecf1317872b6bc05 not found: ID does not exist" Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.707205 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1e6c13-48e7-43b8-a911-ebd0f315e447-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.854527 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.862226 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.873606 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 10:35:21 crc kubenswrapper[4735]: E1001 10:35:21.873967 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fadfe66-81a6-4af1-9d8e-a65139585a11" containerName="init" Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.873982 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fadfe66-81a6-4af1-9d8e-a65139585a11" containerName="init" Oct 01 10:35:21 crc kubenswrapper[4735]: E1001 10:35:21.873996 4735 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="2c1e6c13-48e7-43b8-a911-ebd0f315e447" containerName="cinder-scheduler" Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.874005 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c1e6c13-48e7-43b8-a911-ebd0f315e447" containerName="cinder-scheduler" Oct 01 10:35:21 crc kubenswrapper[4735]: E1001 10:35:21.874020 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c1e6c13-48e7-43b8-a911-ebd0f315e447" containerName="probe" Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.874026 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c1e6c13-48e7-43b8-a911-ebd0f315e447" containerName="probe" Oct 01 10:35:21 crc kubenswrapper[4735]: E1001 10:35:21.874040 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fadfe66-81a6-4af1-9d8e-a65139585a11" containerName="dnsmasq-dns" Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.874046 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fadfe66-81a6-4af1-9d8e-a65139585a11" containerName="dnsmasq-dns" Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.874201 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c1e6c13-48e7-43b8-a911-ebd0f315e447" containerName="probe" Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.874228 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c1e6c13-48e7-43b8-a911-ebd0f315e447" containerName="cinder-scheduler" Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.874237 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fadfe66-81a6-4af1-9d8e-a65139585a11" containerName="dnsmasq-dns" Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.875091 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.877112 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.887234 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.931392 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c1e6c13-48e7-43b8-a911-ebd0f315e447" path="/var/lib/kubelet/pods/2c1e6c13-48e7-43b8-a911-ebd0f315e447/volumes" Oct 01 10:35:21 crc kubenswrapper[4735]: I1001 10:35:21.936822 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fadfe66-81a6-4af1-9d8e-a65139585a11" path="/var/lib/kubelet/pods/3fadfe66-81a6-4af1-9d8e-a65139585a11/volumes" Oct 01 10:35:22 crc kubenswrapper[4735]: I1001 10:35:22.020747 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7925532-6ff1-4914-84b0-8206d0ad5225-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c7925532-6ff1-4914-84b0-8206d0ad5225\") " pod="openstack/cinder-scheduler-0" Oct 01 10:35:22 crc kubenswrapper[4735]: I1001 10:35:22.020806 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7925532-6ff1-4914-84b0-8206d0ad5225-config-data\") pod \"cinder-scheduler-0\" (UID: \"c7925532-6ff1-4914-84b0-8206d0ad5225\") " pod="openstack/cinder-scheduler-0" Oct 01 10:35:22 crc kubenswrapper[4735]: I1001 10:35:22.020897 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn668\" (UniqueName: \"kubernetes.io/projected/c7925532-6ff1-4914-84b0-8206d0ad5225-kube-api-access-dn668\") pod \"cinder-scheduler-0\" (UID: 
\"c7925532-6ff1-4914-84b0-8206d0ad5225\") " pod="openstack/cinder-scheduler-0" Oct 01 10:35:22 crc kubenswrapper[4735]: I1001 10:35:22.020946 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c7925532-6ff1-4914-84b0-8206d0ad5225-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c7925532-6ff1-4914-84b0-8206d0ad5225\") " pod="openstack/cinder-scheduler-0" Oct 01 10:35:22 crc kubenswrapper[4735]: I1001 10:35:22.020971 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7925532-6ff1-4914-84b0-8206d0ad5225-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c7925532-6ff1-4914-84b0-8206d0ad5225\") " pod="openstack/cinder-scheduler-0" Oct 01 10:35:22 crc kubenswrapper[4735]: I1001 10:35:22.021015 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7925532-6ff1-4914-84b0-8206d0ad5225-scripts\") pod \"cinder-scheduler-0\" (UID: \"c7925532-6ff1-4914-84b0-8206d0ad5225\") " pod="openstack/cinder-scheduler-0" Oct 01 10:35:22 crc kubenswrapper[4735]: I1001 10:35:22.122694 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7925532-6ff1-4914-84b0-8206d0ad5225-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c7925532-6ff1-4914-84b0-8206d0ad5225\") " pod="openstack/cinder-scheduler-0" Oct 01 10:35:22 crc kubenswrapper[4735]: I1001 10:35:22.122742 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7925532-6ff1-4914-84b0-8206d0ad5225-config-data\") pod \"cinder-scheduler-0\" (UID: \"c7925532-6ff1-4914-84b0-8206d0ad5225\") " pod="openstack/cinder-scheduler-0" Oct 01 10:35:22 crc 
kubenswrapper[4735]: I1001 10:35:22.122826 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn668\" (UniqueName: \"kubernetes.io/projected/c7925532-6ff1-4914-84b0-8206d0ad5225-kube-api-access-dn668\") pod \"cinder-scheduler-0\" (UID: \"c7925532-6ff1-4914-84b0-8206d0ad5225\") " pod="openstack/cinder-scheduler-0" Oct 01 10:35:22 crc kubenswrapper[4735]: I1001 10:35:22.122872 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c7925532-6ff1-4914-84b0-8206d0ad5225-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c7925532-6ff1-4914-84b0-8206d0ad5225\") " pod="openstack/cinder-scheduler-0" Oct 01 10:35:22 crc kubenswrapper[4735]: I1001 10:35:22.122892 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7925532-6ff1-4914-84b0-8206d0ad5225-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c7925532-6ff1-4914-84b0-8206d0ad5225\") " pod="openstack/cinder-scheduler-0" Oct 01 10:35:22 crc kubenswrapper[4735]: I1001 10:35:22.122913 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7925532-6ff1-4914-84b0-8206d0ad5225-scripts\") pod \"cinder-scheduler-0\" (UID: \"c7925532-6ff1-4914-84b0-8206d0ad5225\") " pod="openstack/cinder-scheduler-0" Oct 01 10:35:22 crc kubenswrapper[4735]: I1001 10:35:22.124003 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c7925532-6ff1-4914-84b0-8206d0ad5225-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c7925532-6ff1-4914-84b0-8206d0ad5225\") " pod="openstack/cinder-scheduler-0" Oct 01 10:35:22 crc kubenswrapper[4735]: I1001 10:35:22.128012 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/c7925532-6ff1-4914-84b0-8206d0ad5225-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c7925532-6ff1-4914-84b0-8206d0ad5225\") " pod="openstack/cinder-scheduler-0" Oct 01 10:35:22 crc kubenswrapper[4735]: I1001 10:35:22.128220 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7925532-6ff1-4914-84b0-8206d0ad5225-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c7925532-6ff1-4914-84b0-8206d0ad5225\") " pod="openstack/cinder-scheduler-0" Oct 01 10:35:22 crc kubenswrapper[4735]: I1001 10:35:22.128626 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7925532-6ff1-4914-84b0-8206d0ad5225-scripts\") pod \"cinder-scheduler-0\" (UID: \"c7925532-6ff1-4914-84b0-8206d0ad5225\") " pod="openstack/cinder-scheduler-0" Oct 01 10:35:22 crc kubenswrapper[4735]: I1001 10:35:22.129275 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7925532-6ff1-4914-84b0-8206d0ad5225-config-data\") pod \"cinder-scheduler-0\" (UID: \"c7925532-6ff1-4914-84b0-8206d0ad5225\") " pod="openstack/cinder-scheduler-0" Oct 01 10:35:22 crc kubenswrapper[4735]: I1001 10:35:22.151814 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn668\" (UniqueName: \"kubernetes.io/projected/c7925532-6ff1-4914-84b0-8206d0ad5225-kube-api-access-dn668\") pod \"cinder-scheduler-0\" (UID: \"c7925532-6ff1-4914-84b0-8206d0ad5225\") " pod="openstack/cinder-scheduler-0" Oct 01 10:35:22 crc kubenswrapper[4735]: I1001 10:35:22.210109 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 10:35:22 crc kubenswrapper[4735]: I1001 10:35:22.537129 4735 generic.go:334] "Generic (PLEG): container finished" podID="caf22975-46d3-4339-87d4-e693baddc266" containerID="17635a0cbfdcdbf14c7c896afa961ca2f921b7899ebb651337f393fd1ba050e8" exitCode=0 Oct 01 10:35:22 crc kubenswrapper[4735]: I1001 10:35:22.537214 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-x5q6q" event={"ID":"caf22975-46d3-4339-87d4-e693baddc266","Type":"ContainerDied","Data":"17635a0cbfdcdbf14c7c896afa961ca2f921b7899ebb651337f393fd1ba050e8"} Oct 01 10:35:22 crc kubenswrapper[4735]: I1001 10:35:22.639413 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 10:35:22 crc kubenswrapper[4735]: W1001 10:35:22.646166 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7925532_6ff1_4914_84b0_8206d0ad5225.slice/crio-32db20c9a2170a8886d85524743a4bcc23bc361db0cf692e109b4e8a15d87ac4 WatchSource:0}: Error finding container 32db20c9a2170a8886d85524743a4bcc23bc361db0cf692e109b4e8a15d87ac4: Status 404 returned error can't find the container with id 32db20c9a2170a8886d85524743a4bcc23bc361db0cf692e109b4e8a15d87ac4 Oct 01 10:35:23 crc kubenswrapper[4735]: I1001 10:35:23.031154 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:35:23 crc kubenswrapper[4735]: I1001 10:35:23.031419 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="956cbf0c-5a26-476c-aed1-911c53b9f80e" containerName="ceilometer-central-agent" containerID="cri-o://3475b296d1f8ed559878c6a6189146bb7c39745e4e5ec08d66583d03cfce377a" gracePeriod=30 Oct 01 10:35:23 crc kubenswrapper[4735]: I1001 10:35:23.031552 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="956cbf0c-5a26-476c-aed1-911c53b9f80e" containerName="ceilometer-notification-agent" containerID="cri-o://83d4c5d3c5eaf5cf99360e86767df507abb380a7764f67223aa534912691ed64" gracePeriod=30 Oct 01 10:35:23 crc kubenswrapper[4735]: I1001 10:35:23.031551 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="956cbf0c-5a26-476c-aed1-911c53b9f80e" containerName="sg-core" containerID="cri-o://aec01ea696a16ac3b1ef40870c1c8685321868f53cb861627316b750b4a81c8c" gracePeriod=30 Oct 01 10:35:23 crc kubenswrapper[4735]: I1001 10:35:23.031711 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="956cbf0c-5a26-476c-aed1-911c53b9f80e" containerName="proxy-httpd" containerID="cri-o://fa19469a362b3cd23a7d346742fc24da10e3724f94d16ef47c3fdeb0c37e5563" gracePeriod=30 Oct 01 10:35:23 crc kubenswrapper[4735]: I1001 10:35:23.040760 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 01 10:35:23 crc kubenswrapper[4735]: I1001 10:35:23.550173 4735 generic.go:334] "Generic (PLEG): container finished" podID="956cbf0c-5a26-476c-aed1-911c53b9f80e" containerID="fa19469a362b3cd23a7d346742fc24da10e3724f94d16ef47c3fdeb0c37e5563" exitCode=0 Oct 01 10:35:23 crc kubenswrapper[4735]: I1001 10:35:23.550206 4735 generic.go:334] "Generic (PLEG): container finished" podID="956cbf0c-5a26-476c-aed1-911c53b9f80e" containerID="aec01ea696a16ac3b1ef40870c1c8685321868f53cb861627316b750b4a81c8c" exitCode=2 Oct 01 10:35:23 crc kubenswrapper[4735]: I1001 10:35:23.550216 4735 generic.go:334] "Generic (PLEG): container finished" podID="956cbf0c-5a26-476c-aed1-911c53b9f80e" containerID="3475b296d1f8ed559878c6a6189146bb7c39745e4e5ec08d66583d03cfce377a" exitCode=0 Oct 01 10:35:23 crc kubenswrapper[4735]: I1001 10:35:23.550245 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"956cbf0c-5a26-476c-aed1-911c53b9f80e","Type":"ContainerDied","Data":"fa19469a362b3cd23a7d346742fc24da10e3724f94d16ef47c3fdeb0c37e5563"} Oct 01 10:35:23 crc kubenswrapper[4735]: I1001 10:35:23.550313 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"956cbf0c-5a26-476c-aed1-911c53b9f80e","Type":"ContainerDied","Data":"aec01ea696a16ac3b1ef40870c1c8685321868f53cb861627316b750b4a81c8c"} Oct 01 10:35:23 crc kubenswrapper[4735]: I1001 10:35:23.550332 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"956cbf0c-5a26-476c-aed1-911c53b9f80e","Type":"ContainerDied","Data":"3475b296d1f8ed559878c6a6189146bb7c39745e4e5ec08d66583d03cfce377a"} Oct 01 10:35:23 crc kubenswrapper[4735]: I1001 10:35:23.551965 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c7925532-6ff1-4914-84b0-8206d0ad5225","Type":"ContainerStarted","Data":"94c9c119ecab07ccf7f409e556f6028209f0c2477a7244bb7a61956499f23d78"} Oct 01 10:35:23 crc kubenswrapper[4735]: I1001 10:35:23.552000 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c7925532-6ff1-4914-84b0-8206d0ad5225","Type":"ContainerStarted","Data":"32db20c9a2170a8886d85524743a4bcc23bc361db0cf692e109b4e8a15d87ac4"} Oct 01 10:35:23 crc kubenswrapper[4735]: I1001 10:35:23.887149 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-x5q6q" Oct 01 10:35:23 crc kubenswrapper[4735]: I1001 10:35:23.958262 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkz7v\" (UniqueName: \"kubernetes.io/projected/caf22975-46d3-4339-87d4-e693baddc266-kube-api-access-dkz7v\") pod \"caf22975-46d3-4339-87d4-e693baddc266\" (UID: \"caf22975-46d3-4339-87d4-e693baddc266\") " Oct 01 10:35:23 crc kubenswrapper[4735]: I1001 10:35:23.958387 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caf22975-46d3-4339-87d4-e693baddc266-config-data\") pod \"caf22975-46d3-4339-87d4-e693baddc266\" (UID: \"caf22975-46d3-4339-87d4-e693baddc266\") " Oct 01 10:35:23 crc kubenswrapper[4735]: I1001 10:35:23.958518 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/caf22975-46d3-4339-87d4-e693baddc266-scripts\") pod \"caf22975-46d3-4339-87d4-e693baddc266\" (UID: \"caf22975-46d3-4339-87d4-e693baddc266\") " Oct 01 10:35:23 crc kubenswrapper[4735]: I1001 10:35:23.958738 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caf22975-46d3-4339-87d4-e693baddc266-combined-ca-bundle\") pod \"caf22975-46d3-4339-87d4-e693baddc266\" (UID: \"caf22975-46d3-4339-87d4-e693baddc266\") " Oct 01 10:35:23 crc kubenswrapper[4735]: I1001 10:35:23.969716 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caf22975-46d3-4339-87d4-e693baddc266-kube-api-access-dkz7v" (OuterVolumeSpecName: "kube-api-access-dkz7v") pod "caf22975-46d3-4339-87d4-e693baddc266" (UID: "caf22975-46d3-4339-87d4-e693baddc266"). InnerVolumeSpecName "kube-api-access-dkz7v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:35:23 crc kubenswrapper[4735]: I1001 10:35:23.971490 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caf22975-46d3-4339-87d4-e693baddc266-scripts" (OuterVolumeSpecName: "scripts") pod "caf22975-46d3-4339-87d4-e693baddc266" (UID: "caf22975-46d3-4339-87d4-e693baddc266"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:35:23 crc kubenswrapper[4735]: I1001 10:35:23.999485 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caf22975-46d3-4339-87d4-e693baddc266-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "caf22975-46d3-4339-87d4-e693baddc266" (UID: "caf22975-46d3-4339-87d4-e693baddc266"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:35:24 crc kubenswrapper[4735]: I1001 10:35:24.002647 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caf22975-46d3-4339-87d4-e693baddc266-config-data" (OuterVolumeSpecName: "config-data") pod "caf22975-46d3-4339-87d4-e693baddc266" (UID: "caf22975-46d3-4339-87d4-e693baddc266"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:35:24 crc kubenswrapper[4735]: I1001 10:35:24.063332 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/caf22975-46d3-4339-87d4-e693baddc266-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:24 crc kubenswrapper[4735]: I1001 10:35:24.063366 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caf22975-46d3-4339-87d4-e693baddc266-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:24 crc kubenswrapper[4735]: I1001 10:35:24.063377 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkz7v\" (UniqueName: \"kubernetes.io/projected/caf22975-46d3-4339-87d4-e693baddc266-kube-api-access-dkz7v\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:24 crc kubenswrapper[4735]: I1001 10:35:24.063386 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caf22975-46d3-4339-87d4-e693baddc266-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:24 crc kubenswrapper[4735]: I1001 10:35:24.565575 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c7925532-6ff1-4914-84b0-8206d0ad5225","Type":"ContainerStarted","Data":"d81de873ed4d48feec9a3175452883c44c24147b79f59918d0989c3a58b24180"} Oct 01 10:35:24 crc kubenswrapper[4735]: I1001 10:35:24.567377 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-x5q6q" event={"ID":"caf22975-46d3-4339-87d4-e693baddc266","Type":"ContainerDied","Data":"f31654e62454b37b06eb185abe76eee1f24cda541bb06fa11cb1e8be72986db7"} Oct 01 10:35:24 crc kubenswrapper[4735]: I1001 10:35:24.567412 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f31654e62454b37b06eb185abe76eee1f24cda541bb06fa11cb1e8be72986db7" Oct 01 10:35:24 crc kubenswrapper[4735]: 
I1001 10:35:24.567436 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-x5q6q" Oct 01 10:35:24 crc kubenswrapper[4735]: I1001 10:35:24.597925 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.597908513 podStartE2EDuration="3.597908513s" podCreationTimestamp="2025-10-01 10:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:35:24.595657612 +0000 UTC m=+1083.288478874" watchObservedRunningTime="2025-10-01 10:35:24.597908513 +0000 UTC m=+1083.290729765" Oct 01 10:35:24 crc kubenswrapper[4735]: I1001 10:35:24.668947 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 01 10:35:24 crc kubenswrapper[4735]: E1001 10:35:24.669337 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caf22975-46d3-4339-87d4-e693baddc266" containerName="nova-cell0-conductor-db-sync" Oct 01 10:35:24 crc kubenswrapper[4735]: I1001 10:35:24.669364 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="caf22975-46d3-4339-87d4-e693baddc266" containerName="nova-cell0-conductor-db-sync" Oct 01 10:35:24 crc kubenswrapper[4735]: I1001 10:35:24.669555 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="caf22975-46d3-4339-87d4-e693baddc266" containerName="nova-cell0-conductor-db-sync" Oct 01 10:35:24 crc kubenswrapper[4735]: I1001 10:35:24.670114 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 01 10:35:24 crc kubenswrapper[4735]: I1001 10:35:24.672253 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-f4xb6" Oct 01 10:35:24 crc kubenswrapper[4735]: I1001 10:35:24.672349 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 01 10:35:24 crc kubenswrapper[4735]: I1001 10:35:24.680901 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 01 10:35:24 crc kubenswrapper[4735]: I1001 10:35:24.779920 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxdgq\" (UniqueName: \"kubernetes.io/projected/61a477e3-4a22-4c96-bfdb-c72c65d4984c-kube-api-access-jxdgq\") pod \"nova-cell0-conductor-0\" (UID: \"61a477e3-4a22-4c96-bfdb-c72c65d4984c\") " pod="openstack/nova-cell0-conductor-0" Oct 01 10:35:24 crc kubenswrapper[4735]: I1001 10:35:24.780012 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61a477e3-4a22-4c96-bfdb-c72c65d4984c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"61a477e3-4a22-4c96-bfdb-c72c65d4984c\") " pod="openstack/nova-cell0-conductor-0" Oct 01 10:35:24 crc kubenswrapper[4735]: I1001 10:35:24.780156 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61a477e3-4a22-4c96-bfdb-c72c65d4984c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"61a477e3-4a22-4c96-bfdb-c72c65d4984c\") " pod="openstack/nova-cell0-conductor-0" Oct 01 10:35:24 crc kubenswrapper[4735]: I1001 10:35:24.881879 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/61a477e3-4a22-4c96-bfdb-c72c65d4984c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"61a477e3-4a22-4c96-bfdb-c72c65d4984c\") " pod="openstack/nova-cell0-conductor-0" Oct 01 10:35:24 crc kubenswrapper[4735]: I1001 10:35:24.882024 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61a477e3-4a22-4c96-bfdb-c72c65d4984c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"61a477e3-4a22-4c96-bfdb-c72c65d4984c\") " pod="openstack/nova-cell0-conductor-0" Oct 01 10:35:24 crc kubenswrapper[4735]: I1001 10:35:24.882110 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxdgq\" (UniqueName: \"kubernetes.io/projected/61a477e3-4a22-4c96-bfdb-c72c65d4984c-kube-api-access-jxdgq\") pod \"nova-cell0-conductor-0\" (UID: \"61a477e3-4a22-4c96-bfdb-c72c65d4984c\") " pod="openstack/nova-cell0-conductor-0" Oct 01 10:35:24 crc kubenswrapper[4735]: I1001 10:35:24.887148 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61a477e3-4a22-4c96-bfdb-c72c65d4984c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"61a477e3-4a22-4c96-bfdb-c72c65d4984c\") " pod="openstack/nova-cell0-conductor-0" Oct 01 10:35:24 crc kubenswrapper[4735]: I1001 10:35:24.916545 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61a477e3-4a22-4c96-bfdb-c72c65d4984c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"61a477e3-4a22-4c96-bfdb-c72c65d4984c\") " pod="openstack/nova-cell0-conductor-0" Oct 01 10:35:24 crc kubenswrapper[4735]: I1001 10:35:24.928597 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxdgq\" (UniqueName: \"kubernetes.io/projected/61a477e3-4a22-4c96-bfdb-c72c65d4984c-kube-api-access-jxdgq\") pod \"nova-cell0-conductor-0\" (UID: 
\"61a477e3-4a22-4c96-bfdb-c72c65d4984c\") " pod="openstack/nova-cell0-conductor-0" Oct 01 10:35:24 crc kubenswrapper[4735]: I1001 10:35:24.983931 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 01 10:35:25 crc kubenswrapper[4735]: W1001 10:35:25.393398 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61a477e3_4a22_4c96_bfdb_c72c65d4984c.slice/crio-6a71931a99e7c5339401e007b8c8289ba057b328c4951eeb373731aaef3979c6 WatchSource:0}: Error finding container 6a71931a99e7c5339401e007b8c8289ba057b328c4951eeb373731aaef3979c6: Status 404 returned error can't find the container with id 6a71931a99e7c5339401e007b8c8289ba057b328c4951eeb373731aaef3979c6 Oct 01 10:35:25 crc kubenswrapper[4735]: I1001 10:35:25.399381 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 01 10:35:25 crc kubenswrapper[4735]: I1001 10:35:25.577745 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"61a477e3-4a22-4c96-bfdb-c72c65d4984c","Type":"ContainerStarted","Data":"6a71931a99e7c5339401e007b8c8289ba057b328c4951eeb373731aaef3979c6"} Oct 01 10:35:25 crc kubenswrapper[4735]: I1001 10:35:25.660622 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.118208 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.214607 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/956cbf0c-5a26-476c-aed1-911c53b9f80e-combined-ca-bundle\") pod \"956cbf0c-5a26-476c-aed1-911c53b9f80e\" (UID: \"956cbf0c-5a26-476c-aed1-911c53b9f80e\") " Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.214744 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/956cbf0c-5a26-476c-aed1-911c53b9f80e-config-data\") pod \"956cbf0c-5a26-476c-aed1-911c53b9f80e\" (UID: \"956cbf0c-5a26-476c-aed1-911c53b9f80e\") " Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.214783 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/956cbf0c-5a26-476c-aed1-911c53b9f80e-log-httpd\") pod \"956cbf0c-5a26-476c-aed1-911c53b9f80e\" (UID: \"956cbf0c-5a26-476c-aed1-911c53b9f80e\") " Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.214866 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/956cbf0c-5a26-476c-aed1-911c53b9f80e-run-httpd\") pod \"956cbf0c-5a26-476c-aed1-911c53b9f80e\" (UID: \"956cbf0c-5a26-476c-aed1-911c53b9f80e\") " Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.214905 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/956cbf0c-5a26-476c-aed1-911c53b9f80e-scripts\") pod \"956cbf0c-5a26-476c-aed1-911c53b9f80e\" (UID: \"956cbf0c-5a26-476c-aed1-911c53b9f80e\") " Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.214946 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfdhh\" (UniqueName: 
\"kubernetes.io/projected/956cbf0c-5a26-476c-aed1-911c53b9f80e-kube-api-access-mfdhh\") pod \"956cbf0c-5a26-476c-aed1-911c53b9f80e\" (UID: \"956cbf0c-5a26-476c-aed1-911c53b9f80e\") " Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.214969 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/956cbf0c-5a26-476c-aed1-911c53b9f80e-sg-core-conf-yaml\") pod \"956cbf0c-5a26-476c-aed1-911c53b9f80e\" (UID: \"956cbf0c-5a26-476c-aed1-911c53b9f80e\") " Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.225714 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/956cbf0c-5a26-476c-aed1-911c53b9f80e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "956cbf0c-5a26-476c-aed1-911c53b9f80e" (UID: "956cbf0c-5a26-476c-aed1-911c53b9f80e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.226916 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/956cbf0c-5a26-476c-aed1-911c53b9f80e-scripts" (OuterVolumeSpecName: "scripts") pod "956cbf0c-5a26-476c-aed1-911c53b9f80e" (UID: "956cbf0c-5a26-476c-aed1-911c53b9f80e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.227961 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/956cbf0c-5a26-476c-aed1-911c53b9f80e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "956cbf0c-5a26-476c-aed1-911c53b9f80e" (UID: "956cbf0c-5a26-476c-aed1-911c53b9f80e"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.228415 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/956cbf0c-5a26-476c-aed1-911c53b9f80e-kube-api-access-mfdhh" (OuterVolumeSpecName: "kube-api-access-mfdhh") pod "956cbf0c-5a26-476c-aed1-911c53b9f80e" (UID: "956cbf0c-5a26-476c-aed1-911c53b9f80e"). InnerVolumeSpecName "kube-api-access-mfdhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.242423 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/956cbf0c-5a26-476c-aed1-911c53b9f80e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "956cbf0c-5a26-476c-aed1-911c53b9f80e" (UID: "956cbf0c-5a26-476c-aed1-911c53b9f80e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.297886 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/956cbf0c-5a26-476c-aed1-911c53b9f80e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "956cbf0c-5a26-476c-aed1-911c53b9f80e" (UID: "956cbf0c-5a26-476c-aed1-911c53b9f80e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.318167 4735 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/956cbf0c-5a26-476c-aed1-911c53b9f80e-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.318192 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/956cbf0c-5a26-476c-aed1-911c53b9f80e-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.318201 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfdhh\" (UniqueName: \"kubernetes.io/projected/956cbf0c-5a26-476c-aed1-911c53b9f80e-kube-api-access-mfdhh\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.318210 4735 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/956cbf0c-5a26-476c-aed1-911c53b9f80e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.318218 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/956cbf0c-5a26-476c-aed1-911c53b9f80e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.318226 4735 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/956cbf0c-5a26-476c-aed1-911c53b9f80e-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.351853 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/956cbf0c-5a26-476c-aed1-911c53b9f80e-config-data" (OuterVolumeSpecName: "config-data") pod "956cbf0c-5a26-476c-aed1-911c53b9f80e" (UID: "956cbf0c-5a26-476c-aed1-911c53b9f80e"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.419413 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/956cbf0c-5a26-476c-aed1-911c53b9f80e-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.592861 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"61a477e3-4a22-4c96-bfdb-c72c65d4984c","Type":"ContainerStarted","Data":"9f953b2ec890ae7044136a6c6cac9659ab6e2e67919970076150bc3ceba4029d"} Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.593002 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.595231 4735 generic.go:334] "Generic (PLEG): container finished" podID="956cbf0c-5a26-476c-aed1-911c53b9f80e" containerID="83d4c5d3c5eaf5cf99360e86767df507abb380a7764f67223aa534912691ed64" exitCode=0 Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.595268 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"956cbf0c-5a26-476c-aed1-911c53b9f80e","Type":"ContainerDied","Data":"83d4c5d3c5eaf5cf99360e86767df507abb380a7764f67223aa534912691ed64"} Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.595286 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"956cbf0c-5a26-476c-aed1-911c53b9f80e","Type":"ContainerDied","Data":"9477c000c7da190cff68a3e396b394dbe9c0d6da3b3d8f73939240f04c37c083"} Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.595302 4735 scope.go:117] "RemoveContainer" containerID="fa19469a362b3cd23a7d346742fc24da10e3724f94d16ef47c3fdeb0c37e5563" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.595434 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.616101 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.616077139 podStartE2EDuration="2.616077139s" podCreationTimestamp="2025-10-01 10:35:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:35:26.605968078 +0000 UTC m=+1085.298789350" watchObservedRunningTime="2025-10-01 10:35:26.616077139 +0000 UTC m=+1085.308898411" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.628041 4735 scope.go:117] "RemoveContainer" containerID="aec01ea696a16ac3b1ef40870c1c8685321868f53cb861627316b750b4a81c8c" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.629225 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.640960 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.657708 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:35:26 crc kubenswrapper[4735]: E1001 10:35:26.658200 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="956cbf0c-5a26-476c-aed1-911c53b9f80e" containerName="sg-core" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.658213 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="956cbf0c-5a26-476c-aed1-911c53b9f80e" containerName="sg-core" Oct 01 10:35:26 crc kubenswrapper[4735]: E1001 10:35:26.658230 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="956cbf0c-5a26-476c-aed1-911c53b9f80e" containerName="ceilometer-notification-agent" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.658236 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="956cbf0c-5a26-476c-aed1-911c53b9f80e" 
containerName="ceilometer-notification-agent" Oct 01 10:35:26 crc kubenswrapper[4735]: E1001 10:35:26.658341 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="956cbf0c-5a26-476c-aed1-911c53b9f80e" containerName="ceilometer-central-agent" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.658350 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="956cbf0c-5a26-476c-aed1-911c53b9f80e" containerName="ceilometer-central-agent" Oct 01 10:35:26 crc kubenswrapper[4735]: E1001 10:35:26.658364 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="956cbf0c-5a26-476c-aed1-911c53b9f80e" containerName="proxy-httpd" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.658369 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="956cbf0c-5a26-476c-aed1-911c53b9f80e" containerName="proxy-httpd" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.658734 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="956cbf0c-5a26-476c-aed1-911c53b9f80e" containerName="ceilometer-notification-agent" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.658750 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="956cbf0c-5a26-476c-aed1-911c53b9f80e" containerName="proxy-httpd" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.658763 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="956cbf0c-5a26-476c-aed1-911c53b9f80e" containerName="sg-core" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.658773 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="956cbf0c-5a26-476c-aed1-911c53b9f80e" containerName="ceilometer-central-agent" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.670043 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.674482 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.675418 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.676858 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.678279 4735 scope.go:117] "RemoveContainer" containerID="83d4c5d3c5eaf5cf99360e86767df507abb380a7764f67223aa534912691ed64" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.707662 4735 scope.go:117] "RemoveContainer" containerID="3475b296d1f8ed559878c6a6189146bb7c39745e4e5ec08d66583d03cfce377a" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.728624 4735 scope.go:117] "RemoveContainer" containerID="fa19469a362b3cd23a7d346742fc24da10e3724f94d16ef47c3fdeb0c37e5563" Oct 01 10:35:26 crc kubenswrapper[4735]: E1001 10:35:26.729234 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa19469a362b3cd23a7d346742fc24da10e3724f94d16ef47c3fdeb0c37e5563\": container with ID starting with fa19469a362b3cd23a7d346742fc24da10e3724f94d16ef47c3fdeb0c37e5563 not found: ID does not exist" containerID="fa19469a362b3cd23a7d346742fc24da10e3724f94d16ef47c3fdeb0c37e5563" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.729280 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa19469a362b3cd23a7d346742fc24da10e3724f94d16ef47c3fdeb0c37e5563"} err="failed to get container status \"fa19469a362b3cd23a7d346742fc24da10e3724f94d16ef47c3fdeb0c37e5563\": rpc error: code = NotFound desc = could not find container \"fa19469a362b3cd23a7d346742fc24da10e3724f94d16ef47c3fdeb0c37e5563\": 
container with ID starting with fa19469a362b3cd23a7d346742fc24da10e3724f94d16ef47c3fdeb0c37e5563 not found: ID does not exist" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.729443 4735 scope.go:117] "RemoveContainer" containerID="aec01ea696a16ac3b1ef40870c1c8685321868f53cb861627316b750b4a81c8c" Oct 01 10:35:26 crc kubenswrapper[4735]: E1001 10:35:26.729951 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aec01ea696a16ac3b1ef40870c1c8685321868f53cb861627316b750b4a81c8c\": container with ID starting with aec01ea696a16ac3b1ef40870c1c8685321868f53cb861627316b750b4a81c8c not found: ID does not exist" containerID="aec01ea696a16ac3b1ef40870c1c8685321868f53cb861627316b750b4a81c8c" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.729976 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aec01ea696a16ac3b1ef40870c1c8685321868f53cb861627316b750b4a81c8c"} err="failed to get container status \"aec01ea696a16ac3b1ef40870c1c8685321868f53cb861627316b750b4a81c8c\": rpc error: code = NotFound desc = could not find container \"aec01ea696a16ac3b1ef40870c1c8685321868f53cb861627316b750b4a81c8c\": container with ID starting with aec01ea696a16ac3b1ef40870c1c8685321868f53cb861627316b750b4a81c8c not found: ID does not exist" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.729994 4735 scope.go:117] "RemoveContainer" containerID="83d4c5d3c5eaf5cf99360e86767df507abb380a7764f67223aa534912691ed64" Oct 01 10:35:26 crc kubenswrapper[4735]: E1001 10:35:26.730842 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83d4c5d3c5eaf5cf99360e86767df507abb380a7764f67223aa534912691ed64\": container with ID starting with 83d4c5d3c5eaf5cf99360e86767df507abb380a7764f67223aa534912691ed64 not found: ID does not exist" 
containerID="83d4c5d3c5eaf5cf99360e86767df507abb380a7764f67223aa534912691ed64" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.730873 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83d4c5d3c5eaf5cf99360e86767df507abb380a7764f67223aa534912691ed64"} err="failed to get container status \"83d4c5d3c5eaf5cf99360e86767df507abb380a7764f67223aa534912691ed64\": rpc error: code = NotFound desc = could not find container \"83d4c5d3c5eaf5cf99360e86767df507abb380a7764f67223aa534912691ed64\": container with ID starting with 83d4c5d3c5eaf5cf99360e86767df507abb380a7764f67223aa534912691ed64 not found: ID does not exist" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.730896 4735 scope.go:117] "RemoveContainer" containerID="3475b296d1f8ed559878c6a6189146bb7c39745e4e5ec08d66583d03cfce377a" Oct 01 10:35:26 crc kubenswrapper[4735]: E1001 10:35:26.731278 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3475b296d1f8ed559878c6a6189146bb7c39745e4e5ec08d66583d03cfce377a\": container with ID starting with 3475b296d1f8ed559878c6a6189146bb7c39745e4e5ec08d66583d03cfce377a not found: ID does not exist" containerID="3475b296d1f8ed559878c6a6189146bb7c39745e4e5ec08d66583d03cfce377a" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.731312 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3475b296d1f8ed559878c6a6189146bb7c39745e4e5ec08d66583d03cfce377a"} err="failed to get container status \"3475b296d1f8ed559878c6a6189146bb7c39745e4e5ec08d66583d03cfce377a\": rpc error: code = NotFound desc = could not find container \"3475b296d1f8ed559878c6a6189146bb7c39745e4e5ec08d66583d03cfce377a\": container with ID starting with 3475b296d1f8ed559878c6a6189146bb7c39745e4e5ec08d66583d03cfce377a not found: ID does not exist" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.824967 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-log-httpd\") pod \"ceilometer-0\" (UID: \"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\") " pod="openstack/ceilometer-0" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.825013 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\") " pod="openstack/ceilometer-0" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.825127 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-run-httpd\") pod \"ceilometer-0\" (UID: \"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\") " pod="openstack/ceilometer-0" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.825164 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\") " pod="openstack/ceilometer-0" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.825232 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-config-data\") pod \"ceilometer-0\" (UID: \"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\") " pod="openstack/ceilometer-0" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.825275 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gbhn\" (UniqueName: 
\"kubernetes.io/projected/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-kube-api-access-9gbhn\") pod \"ceilometer-0\" (UID: \"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\") " pod="openstack/ceilometer-0" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.825305 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-scripts\") pod \"ceilometer-0\" (UID: \"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\") " pod="openstack/ceilometer-0" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.926606 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\") " pod="openstack/ceilometer-0" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.926672 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-config-data\") pod \"ceilometer-0\" (UID: \"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\") " pod="openstack/ceilometer-0" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.926715 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gbhn\" (UniqueName: \"kubernetes.io/projected/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-kube-api-access-9gbhn\") pod \"ceilometer-0\" (UID: \"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\") " pod="openstack/ceilometer-0" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.926742 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-scripts\") pod \"ceilometer-0\" (UID: \"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\") " pod="openstack/ceilometer-0" Oct 01 10:35:26 crc 
kubenswrapper[4735]: I1001 10:35:26.926772 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-log-httpd\") pod \"ceilometer-0\" (UID: \"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\") " pod="openstack/ceilometer-0" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.926790 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\") " pod="openstack/ceilometer-0" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.926844 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-run-httpd\") pod \"ceilometer-0\" (UID: \"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\") " pod="openstack/ceilometer-0" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.946684 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-run-httpd\") pod \"ceilometer-0\" (UID: \"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\") " pod="openstack/ceilometer-0" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.950658 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-log-httpd\") pod \"ceilometer-0\" (UID: \"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\") " pod="openstack/ceilometer-0" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.953255 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-config-data\") pod \"ceilometer-0\" (UID: 
\"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\") " pod="openstack/ceilometer-0" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.953397 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\") " pod="openstack/ceilometer-0" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.953823 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gbhn\" (UniqueName: \"kubernetes.io/projected/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-kube-api-access-9gbhn\") pod \"ceilometer-0\" (UID: \"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\") " pod="openstack/ceilometer-0" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.954376 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-scripts\") pod \"ceilometer-0\" (UID: \"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\") " pod="openstack/ceilometer-0" Oct 01 10:35:26 crc kubenswrapper[4735]: I1001 10:35:26.959122 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\") " pod="openstack/ceilometer-0" Oct 01 10:35:27 crc kubenswrapper[4735]: I1001 10:35:27.003208 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 10:35:27 crc kubenswrapper[4735]: I1001 10:35:27.210485 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 01 10:35:27 crc kubenswrapper[4735]: I1001 10:35:27.468207 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:35:27 crc kubenswrapper[4735]: W1001 10:35:27.474205 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17985564_e3a2_4dd2_8d6a_a6bdc9cb2f33.slice/crio-4e87edfb974cd75f1b2ed74017015bf272147423b9bb3585a151d95761a0ef6a WatchSource:0}: Error finding container 4e87edfb974cd75f1b2ed74017015bf272147423b9bb3585a151d95761a0ef6a: Status 404 returned error can't find the container with id 4e87edfb974cd75f1b2ed74017015bf272147423b9bb3585a151d95761a0ef6a Oct 01 10:35:27 crc kubenswrapper[4735]: I1001 10:35:27.607369 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33","Type":"ContainerStarted","Data":"4e87edfb974cd75f1b2ed74017015bf272147423b9bb3585a151d95761a0ef6a"} Oct 01 10:35:27 crc kubenswrapper[4735]: I1001 10:35:27.918135 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="956cbf0c-5a26-476c-aed1-911c53b9f80e" path="/var/lib/kubelet/pods/956cbf0c-5a26-476c-aed1-911c53b9f80e/volumes" Oct 01 10:35:28 crc kubenswrapper[4735]: I1001 10:35:28.616782 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33","Type":"ContainerStarted","Data":"9219270a289fbed6e235d8673cc2fc9acef5b4d1e4476d8d3bb9034757f85302"} Oct 01 10:35:28 crc kubenswrapper[4735]: I1001 10:35:28.772645 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 10:35:28 crc kubenswrapper[4735]: I1001 10:35:28.773095 4735 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="e1850a74-906c-4ee8-aef6-1e5e32661ac6" containerName="kube-state-metrics" containerID="cri-o://f7cf2fe59085ce7bfe6f71db89e3a2cdad8e64220ab3c1df0c167bd562074af9" gracePeriod=30 Oct 01 10:35:29 crc kubenswrapper[4735]: I1001 10:35:29.219287 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 10:35:29 crc kubenswrapper[4735]: I1001 10:35:29.381135 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wpwh\" (UniqueName: \"kubernetes.io/projected/e1850a74-906c-4ee8-aef6-1e5e32661ac6-kube-api-access-5wpwh\") pod \"e1850a74-906c-4ee8-aef6-1e5e32661ac6\" (UID: \"e1850a74-906c-4ee8-aef6-1e5e32661ac6\") " Oct 01 10:35:29 crc kubenswrapper[4735]: I1001 10:35:29.385121 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1850a74-906c-4ee8-aef6-1e5e32661ac6-kube-api-access-5wpwh" (OuterVolumeSpecName: "kube-api-access-5wpwh") pod "e1850a74-906c-4ee8-aef6-1e5e32661ac6" (UID: "e1850a74-906c-4ee8-aef6-1e5e32661ac6"). InnerVolumeSpecName "kube-api-access-5wpwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:35:29 crc kubenswrapper[4735]: I1001 10:35:29.485042 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wpwh\" (UniqueName: \"kubernetes.io/projected/e1850a74-906c-4ee8-aef6-1e5e32661ac6-kube-api-access-5wpwh\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:29 crc kubenswrapper[4735]: I1001 10:35:29.629512 4735 generic.go:334] "Generic (PLEG): container finished" podID="e1850a74-906c-4ee8-aef6-1e5e32661ac6" containerID="f7cf2fe59085ce7bfe6f71db89e3a2cdad8e64220ab3c1df0c167bd562074af9" exitCode=2 Oct 01 10:35:29 crc kubenswrapper[4735]: I1001 10:35:29.629634 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 10:35:29 crc kubenswrapper[4735]: I1001 10:35:29.629656 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e1850a74-906c-4ee8-aef6-1e5e32661ac6","Type":"ContainerDied","Data":"f7cf2fe59085ce7bfe6f71db89e3a2cdad8e64220ab3c1df0c167bd562074af9"} Oct 01 10:35:29 crc kubenswrapper[4735]: I1001 10:35:29.629901 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e1850a74-906c-4ee8-aef6-1e5e32661ac6","Type":"ContainerDied","Data":"ed31792874cf0ee05ef0b2b0d0fbc07a78c965fc43eb52d8c717c6899e4f23d6"} Oct 01 10:35:29 crc kubenswrapper[4735]: I1001 10:35:29.629934 4735 scope.go:117] "RemoveContainer" containerID="f7cf2fe59085ce7bfe6f71db89e3a2cdad8e64220ab3c1df0c167bd562074af9" Oct 01 10:35:29 crc kubenswrapper[4735]: I1001 10:35:29.634417 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33","Type":"ContainerStarted","Data":"b4549c156fdc204f2c8ad7957f9152e2ae08f278f46318004fbbcbdc14211d4f"} Oct 01 10:35:29 crc kubenswrapper[4735]: I1001 10:35:29.650940 4735 scope.go:117] "RemoveContainer" containerID="f7cf2fe59085ce7bfe6f71db89e3a2cdad8e64220ab3c1df0c167bd562074af9" Oct 01 10:35:29 crc kubenswrapper[4735]: E1001 10:35:29.654748 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7cf2fe59085ce7bfe6f71db89e3a2cdad8e64220ab3c1df0c167bd562074af9\": container with ID starting with f7cf2fe59085ce7bfe6f71db89e3a2cdad8e64220ab3c1df0c167bd562074af9 not found: ID does not exist" containerID="f7cf2fe59085ce7bfe6f71db89e3a2cdad8e64220ab3c1df0c167bd562074af9" Oct 01 10:35:29 crc kubenswrapper[4735]: I1001 10:35:29.654794 4735 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f7cf2fe59085ce7bfe6f71db89e3a2cdad8e64220ab3c1df0c167bd562074af9"} err="failed to get container status \"f7cf2fe59085ce7bfe6f71db89e3a2cdad8e64220ab3c1df0c167bd562074af9\": rpc error: code = NotFound desc = could not find container \"f7cf2fe59085ce7bfe6f71db89e3a2cdad8e64220ab3c1df0c167bd562074af9\": container with ID starting with f7cf2fe59085ce7bfe6f71db89e3a2cdad8e64220ab3c1df0c167bd562074af9 not found: ID does not exist" Oct 01 10:35:29 crc kubenswrapper[4735]: I1001 10:35:29.681981 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 10:35:29 crc kubenswrapper[4735]: I1001 10:35:29.689015 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 10:35:29 crc kubenswrapper[4735]: I1001 10:35:29.709649 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 10:35:29 crc kubenswrapper[4735]: E1001 10:35:29.710134 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1850a74-906c-4ee8-aef6-1e5e32661ac6" containerName="kube-state-metrics" Oct 01 10:35:29 crc kubenswrapper[4735]: I1001 10:35:29.710154 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1850a74-906c-4ee8-aef6-1e5e32661ac6" containerName="kube-state-metrics" Oct 01 10:35:29 crc kubenswrapper[4735]: I1001 10:35:29.710402 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1850a74-906c-4ee8-aef6-1e5e32661ac6" containerName="kube-state-metrics" Oct 01 10:35:29 crc kubenswrapper[4735]: I1001 10:35:29.711203 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 10:35:29 crc kubenswrapper[4735]: I1001 10:35:29.716920 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 01 10:35:29 crc kubenswrapper[4735]: I1001 10:35:29.717796 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 01 10:35:29 crc kubenswrapper[4735]: I1001 10:35:29.730829 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 10:35:29 crc kubenswrapper[4735]: I1001 10:35:29.789874 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f9f14b82-f708-409b-94cb-34b6863dc8cc-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f9f14b82-f708-409b-94cb-34b6863dc8cc\") " pod="openstack/kube-state-metrics-0" Oct 01 10:35:29 crc kubenswrapper[4735]: I1001 10:35:29.789955 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2llm\" (UniqueName: \"kubernetes.io/projected/f9f14b82-f708-409b-94cb-34b6863dc8cc-kube-api-access-b2llm\") pod \"kube-state-metrics-0\" (UID: \"f9f14b82-f708-409b-94cb-34b6863dc8cc\") " pod="openstack/kube-state-metrics-0" Oct 01 10:35:29 crc kubenswrapper[4735]: I1001 10:35:29.789997 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9f14b82-f708-409b-94cb-34b6863dc8cc-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f9f14b82-f708-409b-94cb-34b6863dc8cc\") " pod="openstack/kube-state-metrics-0" Oct 01 10:35:29 crc kubenswrapper[4735]: I1001 10:35:29.790047 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f9f14b82-f708-409b-94cb-34b6863dc8cc-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f9f14b82-f708-409b-94cb-34b6863dc8cc\") " pod="openstack/kube-state-metrics-0" Oct 01 10:35:29 crc kubenswrapper[4735]: I1001 10:35:29.891390 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f9f14b82-f708-409b-94cb-34b6863dc8cc-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f9f14b82-f708-409b-94cb-34b6863dc8cc\") " pod="openstack/kube-state-metrics-0" Oct 01 10:35:29 crc kubenswrapper[4735]: I1001 10:35:29.892324 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2llm\" (UniqueName: \"kubernetes.io/projected/f9f14b82-f708-409b-94cb-34b6863dc8cc-kube-api-access-b2llm\") pod \"kube-state-metrics-0\" (UID: \"f9f14b82-f708-409b-94cb-34b6863dc8cc\") " pod="openstack/kube-state-metrics-0" Oct 01 10:35:29 crc kubenswrapper[4735]: I1001 10:35:29.892439 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9f14b82-f708-409b-94cb-34b6863dc8cc-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f9f14b82-f708-409b-94cb-34b6863dc8cc\") " pod="openstack/kube-state-metrics-0" Oct 01 10:35:29 crc kubenswrapper[4735]: I1001 10:35:29.892561 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9f14b82-f708-409b-94cb-34b6863dc8cc-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f9f14b82-f708-409b-94cb-34b6863dc8cc\") " pod="openstack/kube-state-metrics-0" Oct 01 10:35:29 crc kubenswrapper[4735]: I1001 10:35:29.895151 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/f9f14b82-f708-409b-94cb-34b6863dc8cc-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f9f14b82-f708-409b-94cb-34b6863dc8cc\") " pod="openstack/kube-state-metrics-0" Oct 01 10:35:29 crc kubenswrapper[4735]: I1001 10:35:29.896796 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9f14b82-f708-409b-94cb-34b6863dc8cc-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f9f14b82-f708-409b-94cb-34b6863dc8cc\") " pod="openstack/kube-state-metrics-0" Oct 01 10:35:29 crc kubenswrapper[4735]: I1001 10:35:29.897338 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9f14b82-f708-409b-94cb-34b6863dc8cc-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f9f14b82-f708-409b-94cb-34b6863dc8cc\") " pod="openstack/kube-state-metrics-0" Oct 01 10:35:29 crc kubenswrapper[4735]: I1001 10:35:29.911668 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1850a74-906c-4ee8-aef6-1e5e32661ac6" path="/var/lib/kubelet/pods/e1850a74-906c-4ee8-aef6-1e5e32661ac6/volumes" Oct 01 10:35:29 crc kubenswrapper[4735]: I1001 10:35:29.914220 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2llm\" (UniqueName: \"kubernetes.io/projected/f9f14b82-f708-409b-94cb-34b6863dc8cc-kube-api-access-b2llm\") pod \"kube-state-metrics-0\" (UID: \"f9f14b82-f708-409b-94cb-34b6863dc8cc\") " pod="openstack/kube-state-metrics-0" Oct 01 10:35:30 crc kubenswrapper[4735]: I1001 10:35:30.036856 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 10:35:30 crc kubenswrapper[4735]: I1001 10:35:30.486711 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 10:35:30 crc kubenswrapper[4735]: W1001 10:35:30.493319 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9f14b82_f708_409b_94cb_34b6863dc8cc.slice/crio-65767a227d8025186f8a654df0e686a8082cd7b06753a4431eac2da7bcc9340d WatchSource:0}: Error finding container 65767a227d8025186f8a654df0e686a8082cd7b06753a4431eac2da7bcc9340d: Status 404 returned error can't find the container with id 65767a227d8025186f8a654df0e686a8082cd7b06753a4431eac2da7bcc9340d Oct 01 10:35:30 crc kubenswrapper[4735]: I1001 10:35:30.616874 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:35:30 crc kubenswrapper[4735]: I1001 10:35:30.642431 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33","Type":"ContainerStarted","Data":"b114f094a153682b9a2d11bf4cce1452774fd9c62c1e9b2535ff5748f5485064"} Oct 01 10:35:30 crc kubenswrapper[4735]: I1001 10:35:30.643902 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f9f14b82-f708-409b-94cb-34b6863dc8cc","Type":"ContainerStarted","Data":"65767a227d8025186f8a654df0e686a8082cd7b06753a4431eac2da7bcc9340d"} Oct 01 10:35:31 crc kubenswrapper[4735]: I1001 10:35:31.655255 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f9f14b82-f708-409b-94cb-34b6863dc8cc","Type":"ContainerStarted","Data":"6726a8e5b4d5ffa953ce3b3dcf4690984f937004fbdbf9a835af6790c79b3f74"} Oct 01 10:35:31 crc kubenswrapper[4735]: I1001 10:35:31.656030 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 01 
10:35:31 crc kubenswrapper[4735]: I1001 10:35:31.658747 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33","Type":"ContainerStarted","Data":"5d047286a99c397685ba6ef7d5fcbce2ab6839a7cb477facbbca74f92269e474"}
Oct 01 10:35:31 crc kubenswrapper[4735]: I1001 10:35:31.658910 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 01 10:35:31 crc kubenswrapper[4735]: I1001 10:35:31.658894 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33" containerName="ceilometer-central-agent" containerID="cri-o://9219270a289fbed6e235d8673cc2fc9acef5b4d1e4476d8d3bb9034757f85302" gracePeriod=30
Oct 01 10:35:31 crc kubenswrapper[4735]: I1001 10:35:31.658972 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33" containerName="sg-core" containerID="cri-o://b114f094a153682b9a2d11bf4cce1452774fd9c62c1e9b2535ff5748f5485064" gracePeriod=30
Oct 01 10:35:31 crc kubenswrapper[4735]: I1001 10:35:31.658995 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33" containerName="ceilometer-notification-agent" containerID="cri-o://b4549c156fdc204f2c8ad7957f9152e2ae08f278f46318004fbbcbdc14211d4f" gracePeriod=30
Oct 01 10:35:31 crc kubenswrapper[4735]: I1001 10:35:31.659056 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33" containerName="proxy-httpd" containerID="cri-o://5d047286a99c397685ba6ef7d5fcbce2ab6839a7cb477facbbca74f92269e474" gracePeriod=30
Oct 01 10:35:31 crc kubenswrapper[4735]: I1001 10:35:31.676847 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.2924970399999998 podStartE2EDuration="2.676831504s" podCreationTimestamp="2025-10-01 10:35:29 +0000 UTC" firstStartedPulling="2025-10-01 10:35:30.495590151 +0000 UTC m=+1089.188411413" lastFinishedPulling="2025-10-01 10:35:30.879924615 +0000 UTC m=+1089.572745877" observedRunningTime="2025-10-01 10:35:31.674586714 +0000 UTC m=+1090.367408006" watchObservedRunningTime="2025-10-01 10:35:31.676831504 +0000 UTC m=+1090.369652786"
Oct 01 10:35:31 crc kubenswrapper[4735]: I1001 10:35:31.713914 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.139043974 podStartE2EDuration="5.71389645s" podCreationTimestamp="2025-10-01 10:35:26 +0000 UTC" firstStartedPulling="2025-10-01 10:35:27.477853191 +0000 UTC m=+1086.170674443" lastFinishedPulling="2025-10-01 10:35:31.052705657 +0000 UTC m=+1089.745526919" observedRunningTime="2025-10-01 10:35:31.709616435 +0000 UTC m=+1090.402437707" watchObservedRunningTime="2025-10-01 10:35:31.71389645 +0000 UTC m=+1090.406717712"
Oct 01 10:35:32 crc kubenswrapper[4735]: I1001 10:35:32.405289 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Oct 01 10:35:32 crc kubenswrapper[4735]: I1001 10:35:32.670624 4735 generic.go:334] "Generic (PLEG): container finished" podID="17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33" containerID="5d047286a99c397685ba6ef7d5fcbce2ab6839a7cb477facbbca74f92269e474" exitCode=0
Oct 01 10:35:32 crc kubenswrapper[4735]: I1001 10:35:32.670971 4735 generic.go:334] "Generic (PLEG): container finished" podID="17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33" containerID="b114f094a153682b9a2d11bf4cce1452774fd9c62c1e9b2535ff5748f5485064" exitCode=2
Oct 01 10:35:32 crc kubenswrapper[4735]: I1001 10:35:32.670984 4735 generic.go:334] "Generic (PLEG): container finished" podID="17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33" containerID="b4549c156fdc204f2c8ad7957f9152e2ae08f278f46318004fbbcbdc14211d4f" exitCode=0
Oct 01 10:35:32 crc kubenswrapper[4735]: I1001 10:35:32.671601 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33","Type":"ContainerDied","Data":"5d047286a99c397685ba6ef7d5fcbce2ab6839a7cb477facbbca74f92269e474"}
Oct 01 10:35:32 crc kubenswrapper[4735]: I1001 10:35:32.671630 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33","Type":"ContainerDied","Data":"b114f094a153682b9a2d11bf4cce1452774fd9c62c1e9b2535ff5748f5485064"}
Oct 01 10:35:32 crc kubenswrapper[4735]: I1001 10:35:32.671643 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33","Type":"ContainerDied","Data":"b4549c156fdc204f2c8ad7957f9152e2ae08f278f46318004fbbcbdc14211d4f"}
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.009014 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.156035 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.280770 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gbhn\" (UniqueName: \"kubernetes.io/projected/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-kube-api-access-9gbhn\") pod \"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\" (UID: \"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\") "
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.280848 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-sg-core-conf-yaml\") pod \"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\" (UID: \"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\") "
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.280932 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-config-data\") pod \"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\" (UID: \"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\") "
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.280981 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-run-httpd\") pod \"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\" (UID: \"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\") "
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.281045 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-scripts\") pod \"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\" (UID: \"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\") "
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.281213 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-combined-ca-bundle\") pod \"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\" (UID: \"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\") "
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.281265 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-log-httpd\") pod \"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\" (UID: \"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33\") "
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.282829 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33" (UID: "17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.283422 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33" (UID: "17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.303684 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-scripts" (OuterVolumeSpecName: "scripts") pod "17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33" (UID: "17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.303811 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-kube-api-access-9gbhn" (OuterVolumeSpecName: "kube-api-access-9gbhn") pod "17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33" (UID: "17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33"). InnerVolumeSpecName "kube-api-access-9gbhn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.322260 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33" (UID: "17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.383411 4735 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.383455 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gbhn\" (UniqueName: \"kubernetes.io/projected/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-kube-api-access-9gbhn\") on node \"crc\" DevicePath \"\""
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.383474 4735 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.383487 4735 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.383514 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-scripts\") on node \"crc\" DevicePath \"\""
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.387339 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33" (UID: "17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.410259 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-config-data" (OuterVolumeSpecName: "config-data") pod "17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33" (UID: "17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.466158 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-hwtzm"]
Oct 01 10:35:35 crc kubenswrapper[4735]: E1001 10:35:35.466639 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33" containerName="sg-core"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.466659 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33" containerName="sg-core"
Oct 01 10:35:35 crc kubenswrapper[4735]: E1001 10:35:35.466680 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33" containerName="ceilometer-central-agent"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.466688 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33" containerName="ceilometer-central-agent"
Oct 01 10:35:35 crc kubenswrapper[4735]: E1001 10:35:35.466707 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33" containerName="proxy-httpd"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.466715 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33" containerName="proxy-httpd"
Oct 01 10:35:35 crc kubenswrapper[4735]: E1001 10:35:35.466732 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33" containerName="ceilometer-notification-agent"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.466740 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33" containerName="ceilometer-notification-agent"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.466964 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33" containerName="sg-core"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.466988 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33" containerName="ceilometer-notification-agent"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.467002 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33" containerName="proxy-httpd"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.467026 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33" containerName="ceilometer-central-agent"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.467729 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-hwtzm"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.470046 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.470211 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.484875 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.484897 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.484930 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-hwtzm"]
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.589624 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/199d79d9-6c17-4d68-af5f-623ab4ceb059-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-hwtzm\" (UID: \"199d79d9-6c17-4d68-af5f-623ab4ceb059\") " pod="openstack/nova-cell0-cell-mapping-hwtzm"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.589764 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnsl4\" (UniqueName: \"kubernetes.io/projected/199d79d9-6c17-4d68-af5f-623ab4ceb059-kube-api-access-lnsl4\") pod \"nova-cell0-cell-mapping-hwtzm\" (UID: \"199d79d9-6c17-4d68-af5f-623ab4ceb059\") " pod="openstack/nova-cell0-cell-mapping-hwtzm"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.589922 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/199d79d9-6c17-4d68-af5f-623ab4ceb059-config-data\") pod \"nova-cell0-cell-mapping-hwtzm\" (UID: \"199d79d9-6c17-4d68-af5f-623ab4ceb059\") " pod="openstack/nova-cell0-cell-mapping-hwtzm"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.590009 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/199d79d9-6c17-4d68-af5f-623ab4ceb059-scripts\") pod \"nova-cell0-cell-mapping-hwtzm\" (UID: \"199d79d9-6c17-4d68-af5f-623ab4ceb059\") " pod="openstack/nova-cell0-cell-mapping-hwtzm"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.669248 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.670587 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.682623 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.687055 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.691195 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/199d79d9-6c17-4d68-af5f-623ab4ceb059-config-data\") pod \"nova-cell0-cell-mapping-hwtzm\" (UID: \"199d79d9-6c17-4d68-af5f-623ab4ceb059\") " pod="openstack/nova-cell0-cell-mapping-hwtzm"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.691280 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/199d79d9-6c17-4d68-af5f-623ab4ceb059-scripts\") pod \"nova-cell0-cell-mapping-hwtzm\" (UID: \"199d79d9-6c17-4d68-af5f-623ab4ceb059\") " pod="openstack/nova-cell0-cell-mapping-hwtzm"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.691301 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/199d79d9-6c17-4d68-af5f-623ab4ceb059-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-hwtzm\" (UID: \"199d79d9-6c17-4d68-af5f-623ab4ceb059\") " pod="openstack/nova-cell0-cell-mapping-hwtzm"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.691362 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnsl4\" (UniqueName: \"kubernetes.io/projected/199d79d9-6c17-4d68-af5f-623ab4ceb059-kube-api-access-lnsl4\") pod \"nova-cell0-cell-mapping-hwtzm\" (UID: \"199d79d9-6c17-4d68-af5f-623ab4ceb059\") " pod="openstack/nova-cell0-cell-mapping-hwtzm"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.697487 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/199d79d9-6c17-4d68-af5f-623ab4ceb059-scripts\") pod \"nova-cell0-cell-mapping-hwtzm\" (UID: \"199d79d9-6c17-4d68-af5f-623ab4ceb059\") " pod="openstack/nova-cell0-cell-mapping-hwtzm"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.697610 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/199d79d9-6c17-4d68-af5f-623ab4ceb059-config-data\") pod \"nova-cell0-cell-mapping-hwtzm\" (UID: \"199d79d9-6c17-4d68-af5f-623ab4ceb059\") " pod="openstack/nova-cell0-cell-mapping-hwtzm"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.697615 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/199d79d9-6c17-4d68-af5f-623ab4ceb059-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-hwtzm\" (UID: \"199d79d9-6c17-4d68-af5f-623ab4ceb059\") " pod="openstack/nova-cell0-cell-mapping-hwtzm"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.709585 4735 generic.go:334] "Generic (PLEG): container finished" podID="17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33" containerID="9219270a289fbed6e235d8673cc2fc9acef5b4d1e4476d8d3bb9034757f85302" exitCode=0
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.709648 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33","Type":"ContainerDied","Data":"9219270a289fbed6e235d8673cc2fc9acef5b4d1e4476d8d3bb9034757f85302"}
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.709681 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33","Type":"ContainerDied","Data":"4e87edfb974cd75f1b2ed74017015bf272147423b9bb3585a151d95761a0ef6a"}
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.709698 4735 scope.go:117] "RemoveContainer" containerID="5d047286a99c397685ba6ef7d5fcbce2ab6839a7cb477facbbca74f92269e474"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.709824 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.718655 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnsl4\" (UniqueName: \"kubernetes.io/projected/199d79d9-6c17-4d68-af5f-623ab4ceb059-kube-api-access-lnsl4\") pod \"nova-cell0-cell-mapping-hwtzm\" (UID: \"199d79d9-6c17-4d68-af5f-623ab4ceb059\") " pod="openstack/nova-cell0-cell-mapping-hwtzm"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.740653 4735 scope.go:117] "RemoveContainer" containerID="b114f094a153682b9a2d11bf4cce1452774fd9c62c1e9b2535ff5748f5485064"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.748329 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.752365 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.756799 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.780588 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.789011 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-hwtzm"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.794388 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9e6634b-3e22-4f35-a3e6-59053bb806fc-config-data\") pod \"nova-api-0\" (UID: \"b9e6634b-3e22-4f35-a3e6-59053bb806fc\") " pod="openstack/nova-api-0"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.794426 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxh9z\" (UniqueName: \"kubernetes.io/projected/b9e6634b-3e22-4f35-a3e6-59053bb806fc-kube-api-access-zxh9z\") pod \"nova-api-0\" (UID: \"b9e6634b-3e22-4f35-a3e6-59053bb806fc\") " pod="openstack/nova-api-0"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.794446 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9e6634b-3e22-4f35-a3e6-59053bb806fc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b9e6634b-3e22-4f35-a3e6-59053bb806fc\") " pod="openstack/nova-api-0"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.794480 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9e6634b-3e22-4f35-a3e6-59053bb806fc-logs\") pod \"nova-api-0\" (UID: \"b9e6634b-3e22-4f35-a3e6-59053bb806fc\") " pod="openstack/nova-api-0"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.805547 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.806992 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.814010 4735 scope.go:117] "RemoveContainer" containerID="b4549c156fdc204f2c8ad7957f9152e2ae08f278f46318004fbbcbdc14211d4f"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.820540 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.827300 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.859509 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.890157 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.908009 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9e6634b-3e22-4f35-a3e6-59053bb806fc-logs\") pod \"nova-api-0\" (UID: \"b9e6634b-3e22-4f35-a3e6-59053bb806fc\") " pod="openstack/nova-api-0"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.908077 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66442e52-6cd9-44d4-a77f-9d67f83d4d94-logs\") pod \"nova-metadata-0\" (UID: \"66442e52-6cd9-44d4-a77f-9d67f83d4d94\") " pod="openstack/nova-metadata-0"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.908122 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66442e52-6cd9-44d4-a77f-9d67f83d4d94-config-data\") pod \"nova-metadata-0\" (UID: \"66442e52-6cd9-44d4-a77f-9d67f83d4d94\") " pod="openstack/nova-metadata-0"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.908172 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cd7d42f-bcb3-49a7-aef2-1372a983e375-config-data\") pod \"nova-scheduler-0\" (UID: \"1cd7d42f-bcb3-49a7-aef2-1372a983e375\") " pod="openstack/nova-scheduler-0"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.908341 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnxsr\" (UniqueName: \"kubernetes.io/projected/1cd7d42f-bcb3-49a7-aef2-1372a983e375-kube-api-access-vnxsr\") pod \"nova-scheduler-0\" (UID: \"1cd7d42f-bcb3-49a7-aef2-1372a983e375\") " pod="openstack/nova-scheduler-0"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.908390 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvn6c\" (UniqueName: \"kubernetes.io/projected/66442e52-6cd9-44d4-a77f-9d67f83d4d94-kube-api-access-kvn6c\") pod \"nova-metadata-0\" (UID: \"66442e52-6cd9-44d4-a77f-9d67f83d4d94\") " pod="openstack/nova-metadata-0"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.908411 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cd7d42f-bcb3-49a7-aef2-1372a983e375-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1cd7d42f-bcb3-49a7-aef2-1372a983e375\") " pod="openstack/nova-scheduler-0"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.908435 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9e6634b-3e22-4f35-a3e6-59053bb806fc-logs\") pod \"nova-api-0\" (UID: \"b9e6634b-3e22-4f35-a3e6-59053bb806fc\") " pod="openstack/nova-api-0"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.909269 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9e6634b-3e22-4f35-a3e6-59053bb806fc-config-data\") pod \"nova-api-0\" (UID: \"b9e6634b-3e22-4f35-a3e6-59053bb806fc\") " pod="openstack/nova-api-0"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.909304 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66442e52-6cd9-44d4-a77f-9d67f83d4d94-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"66442e52-6cd9-44d4-a77f-9d67f83d4d94\") " pod="openstack/nova-metadata-0"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.909424 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxh9z\" (UniqueName: \"kubernetes.io/projected/b9e6634b-3e22-4f35-a3e6-59053bb806fc-kube-api-access-zxh9z\") pod \"nova-api-0\" (UID: \"b9e6634b-3e22-4f35-a3e6-59053bb806fc\") " pod="openstack/nova-api-0"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.909452 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9e6634b-3e22-4f35-a3e6-59053bb806fc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b9e6634b-3e22-4f35-a3e6-59053bb806fc\") " pod="openstack/nova-api-0"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.916012 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9e6634b-3e22-4f35-a3e6-59053bb806fc-config-data\") pod \"nova-api-0\" (UID: \"b9e6634b-3e22-4f35-a3e6-59053bb806fc\") " pod="openstack/nova-api-0"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.921031 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9e6634b-3e22-4f35-a3e6-59053bb806fc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b9e6634b-3e22-4f35-a3e6-59053bb806fc\") " pod="openstack/nova-api-0"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.944433 4735 scope.go:117] "RemoveContainer" containerID="9219270a289fbed6e235d8673cc2fc9acef5b4d1e4476d8d3bb9034757f85302"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.947483 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33" path="/var/lib/kubelet/pods/17985564-e3a2-4dd2-8d6a-a6bdc9cb2f33/volumes"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.951257 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.956769 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.956797 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.959488 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.962340 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxh9z\" (UniqueName: \"kubernetes.io/projected/b9e6634b-3e22-4f35-a3e6-59053bb806fc-kube-api-access-zxh9z\") pod \"nova-api-0\" (UID: \"b9e6634b-3e22-4f35-a3e6-59053bb806fc\") " pod="openstack/nova-api-0"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.962936 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.963123 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.967658 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.973848 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.974118 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.982827 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 10:35:35 crc kubenswrapper[4735]: I1001 10:35:35.984017 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.002891 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-l6t6p"]
Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.004646 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-l6t6p"
Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.012091 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66442e52-6cd9-44d4-a77f-9d67f83d4d94-logs\") pod \"nova-metadata-0\" (UID: \"66442e52-6cd9-44d4-a77f-9d67f83d4d94\") " pod="openstack/nova-metadata-0"
Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.012126 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66442e52-6cd9-44d4-a77f-9d67f83d4d94-config-data\") pod \"nova-metadata-0\" (UID: \"66442e52-6cd9-44d4-a77f-9d67f83d4d94\") " pod="openstack/nova-metadata-0"
Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.012155 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cd7d42f-bcb3-49a7-aef2-1372a983e375-config-data\") pod \"nova-scheduler-0\" (UID: \"1cd7d42f-bcb3-49a7-aef2-1372a983e375\") " pod="openstack/nova-scheduler-0"
Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.012219 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnxsr\" (UniqueName: \"kubernetes.io/projected/1cd7d42f-bcb3-49a7-aef2-1372a983e375-kube-api-access-vnxsr\") pod \"nova-scheduler-0\" (UID: \"1cd7d42f-bcb3-49a7-aef2-1372a983e375\") " pod="openstack/nova-scheduler-0"
Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.012269 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvn6c\" (UniqueName: \"kubernetes.io/projected/66442e52-6cd9-44d4-a77f-9d67f83d4d94-kube-api-access-kvn6c\") pod \"nova-metadata-0\" (UID: \"66442e52-6cd9-44d4-a77f-9d67f83d4d94\") " pod="openstack/nova-metadata-0"
Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.012288 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cd7d42f-bcb3-49a7-aef2-1372a983e375-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1cd7d42f-bcb3-49a7-aef2-1372a983e375\") " pod="openstack/nova-scheduler-0"
Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.012355 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66442e52-6cd9-44d4-a77f-9d67f83d4d94-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"66442e52-6cd9-44d4-a77f-9d67f83d4d94\") " pod="openstack/nova-metadata-0"
Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.012681 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66442e52-6cd9-44d4-a77f-9d67f83d4d94-logs\") pod \"nova-metadata-0\" (UID: \"66442e52-6cd9-44d4-a77f-9d67f83d4d94\") " pod="openstack/nova-metadata-0"
Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.013624 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-l6t6p"]
Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.016327 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66442e52-6cd9-44d4-a77f-9d67f83d4d94-config-data\") pod \"nova-metadata-0\" (UID: \"66442e52-6cd9-44d4-a77f-9d67f83d4d94\") " pod="openstack/nova-metadata-0"
Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.022123 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cd7d42f-bcb3-49a7-aef2-1372a983e375-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1cd7d42f-bcb3-49a7-aef2-1372a983e375\") " pod="openstack/nova-scheduler-0"
Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.024043 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66442e52-6cd9-44d4-a77f-9d67f83d4d94-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"66442e52-6cd9-44d4-a77f-9d67f83d4d94\") " pod="openstack/nova-metadata-0"
Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.025235 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cd7d42f-bcb3-49a7-aef2-1372a983e375-config-data\") pod \"nova-scheduler-0\" (UID: \"1cd7d42f-bcb3-49a7-aef2-1372a983e375\") " pod="openstack/nova-scheduler-0"
Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.032458 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvn6c\" (UniqueName: \"kubernetes.io/projected/66442e52-6cd9-44d4-a77f-9d67f83d4d94-kube-api-access-kvn6c\") pod \"nova-metadata-0\" (UID: \"66442e52-6cd9-44d4-a77f-9d67f83d4d94\") " pod="openstack/nova-metadata-0"
Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.033907 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnxsr\" (UniqueName: \"kubernetes.io/projected/1cd7d42f-bcb3-49a7-aef2-1372a983e375-kube-api-access-vnxsr\") pod \"nova-scheduler-0\" (UID: \"1cd7d42f-bcb3-49a7-aef2-1372a983e375\") " pod="openstack/nova-scheduler-0"
Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.061776 4735 scope.go:117] "RemoveContainer" containerID="5d047286a99c397685ba6ef7d5fcbce2ab6839a7cb477facbbca74f92269e474"
Oct 01 10:35:36 crc kubenswrapper[4735]: E1001 10:35:36.070734 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d047286a99c397685ba6ef7d5fcbce2ab6839a7cb477facbbca74f92269e474\": container with ID starting with 5d047286a99c397685ba6ef7d5fcbce2ab6839a7cb477facbbca74f92269e474 not found: ID does not exist" containerID="5d047286a99c397685ba6ef7d5fcbce2ab6839a7cb477facbbca74f92269e474"
Oct 01 10:35:36 crc kubenswrapper[4735]: 
I1001 10:35:36.070783 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d047286a99c397685ba6ef7d5fcbce2ab6839a7cb477facbbca74f92269e474"} err="failed to get container status \"5d047286a99c397685ba6ef7d5fcbce2ab6839a7cb477facbbca74f92269e474\": rpc error: code = NotFound desc = could not find container \"5d047286a99c397685ba6ef7d5fcbce2ab6839a7cb477facbbca74f92269e474\": container with ID starting with 5d047286a99c397685ba6ef7d5fcbce2ab6839a7cb477facbbca74f92269e474 not found: ID does not exist" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.070815 4735 scope.go:117] "RemoveContainer" containerID="b114f094a153682b9a2d11bf4cce1452774fd9c62c1e9b2535ff5748f5485064" Oct 01 10:35:36 crc kubenswrapper[4735]: E1001 10:35:36.072922 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b114f094a153682b9a2d11bf4cce1452774fd9c62c1e9b2535ff5748f5485064\": container with ID starting with b114f094a153682b9a2d11bf4cce1452774fd9c62c1e9b2535ff5748f5485064 not found: ID does not exist" containerID="b114f094a153682b9a2d11bf4cce1452774fd9c62c1e9b2535ff5748f5485064" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.072963 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b114f094a153682b9a2d11bf4cce1452774fd9c62c1e9b2535ff5748f5485064"} err="failed to get container status \"b114f094a153682b9a2d11bf4cce1452774fd9c62c1e9b2535ff5748f5485064\": rpc error: code = NotFound desc = could not find container \"b114f094a153682b9a2d11bf4cce1452774fd9c62c1e9b2535ff5748f5485064\": container with ID starting with b114f094a153682b9a2d11bf4cce1452774fd9c62c1e9b2535ff5748f5485064 not found: ID does not exist" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.073004 4735 scope.go:117] "RemoveContainer" containerID="b4549c156fdc204f2c8ad7957f9152e2ae08f278f46318004fbbcbdc14211d4f" Oct 01 10:35:36 crc 
kubenswrapper[4735]: E1001 10:35:36.073505 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4549c156fdc204f2c8ad7957f9152e2ae08f278f46318004fbbcbdc14211d4f\": container with ID starting with b4549c156fdc204f2c8ad7957f9152e2ae08f278f46318004fbbcbdc14211d4f not found: ID does not exist" containerID="b4549c156fdc204f2c8ad7957f9152e2ae08f278f46318004fbbcbdc14211d4f" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.073540 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4549c156fdc204f2c8ad7957f9152e2ae08f278f46318004fbbcbdc14211d4f"} err="failed to get container status \"b4549c156fdc204f2c8ad7957f9152e2ae08f278f46318004fbbcbdc14211d4f\": rpc error: code = NotFound desc = could not find container \"b4549c156fdc204f2c8ad7957f9152e2ae08f278f46318004fbbcbdc14211d4f\": container with ID starting with b4549c156fdc204f2c8ad7957f9152e2ae08f278f46318004fbbcbdc14211d4f not found: ID does not exist" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.073558 4735 scope.go:117] "RemoveContainer" containerID="9219270a289fbed6e235d8673cc2fc9acef5b4d1e4476d8d3bb9034757f85302" Oct 01 10:35:36 crc kubenswrapper[4735]: E1001 10:35:36.074431 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9219270a289fbed6e235d8673cc2fc9acef5b4d1e4476d8d3bb9034757f85302\": container with ID starting with 9219270a289fbed6e235d8673cc2fc9acef5b4d1e4476d8d3bb9034757f85302 not found: ID does not exist" containerID="9219270a289fbed6e235d8673cc2fc9acef5b4d1e4476d8d3bb9034757f85302" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.074471 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9219270a289fbed6e235d8673cc2fc9acef5b4d1e4476d8d3bb9034757f85302"} err="failed to get container status 
\"9219270a289fbed6e235d8673cc2fc9acef5b4d1e4476d8d3bb9034757f85302\": rpc error: code = NotFound desc = could not find container \"9219270a289fbed6e235d8673cc2fc9acef5b4d1e4476d8d3bb9034757f85302\": container with ID starting with 9219270a289fbed6e235d8673cc2fc9acef5b4d1e4476d8d3bb9034757f85302 not found: ID does not exist" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.114172 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ae06196-4040-40db-9dd1-2f4a7c1f616c-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-l6t6p\" (UID: \"9ae06196-4040-40db-9dd1-2f4a7c1f616c\") " pod="openstack/dnsmasq-dns-845d6d6f59-l6t6p" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.114419 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7557d9f9-d00b-4692-95e1-1bdd819aab0c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\") " pod="openstack/ceilometer-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.114443 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjqx6\" (UniqueName: \"kubernetes.io/projected/9ae06196-4040-40db-9dd1-2f4a7c1f616c-kube-api-access-tjqx6\") pod \"dnsmasq-dns-845d6d6f59-l6t6p\" (UID: \"9ae06196-4040-40db-9dd1-2f4a7c1f616c\") " pod="openstack/dnsmasq-dns-845d6d6f59-l6t6p" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.114478 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7557d9f9-d00b-4692-95e1-1bdd819aab0c-log-httpd\") pod \"ceilometer-0\" (UID: \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\") " pod="openstack/ceilometer-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.114507 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7557d9f9-d00b-4692-95e1-1bdd819aab0c-scripts\") pod \"ceilometer-0\" (UID: \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\") " pod="openstack/ceilometer-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.114549 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdjjr\" (UniqueName: \"kubernetes.io/projected/be2afa25-77bd-421d-b0c6-67a3d31b642f-kube-api-access-hdjjr\") pod \"nova-cell1-novncproxy-0\" (UID: \"be2afa25-77bd-421d-b0c6-67a3d31b642f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.114569 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ae06196-4040-40db-9dd1-2f4a7c1f616c-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-l6t6p\" (UID: \"9ae06196-4040-40db-9dd1-2f4a7c1f616c\") " pod="openstack/dnsmasq-dns-845d6d6f59-l6t6p" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.114588 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be2afa25-77bd-421d-b0c6-67a3d31b642f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"be2afa25-77bd-421d-b0c6-67a3d31b642f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.114604 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be2afa25-77bd-421d-b0c6-67a3d31b642f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"be2afa25-77bd-421d-b0c6-67a3d31b642f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.114619 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7557d9f9-d00b-4692-95e1-1bdd819aab0c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\") " pod="openstack/ceilometer-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.114635 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ae06196-4040-40db-9dd1-2f4a7c1f616c-config\") pod \"dnsmasq-dns-845d6d6f59-l6t6p\" (UID: \"9ae06196-4040-40db-9dd1-2f4a7c1f616c\") " pod="openstack/dnsmasq-dns-845d6d6f59-l6t6p" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.114665 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7557d9f9-d00b-4692-95e1-1bdd819aab0c-run-httpd\") pod \"ceilometer-0\" (UID: \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\") " pod="openstack/ceilometer-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.114688 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ae06196-4040-40db-9dd1-2f4a7c1f616c-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-l6t6p\" (UID: \"9ae06196-4040-40db-9dd1-2f4a7c1f616c\") " pod="openstack/dnsmasq-dns-845d6d6f59-l6t6p" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.114709 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7557d9f9-d00b-4692-95e1-1bdd819aab0c-config-data\") pod \"ceilometer-0\" (UID: \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\") " pod="openstack/ceilometer-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.114736 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7557d9f9-d00b-4692-95e1-1bdd819aab0c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\") " pod="openstack/ceilometer-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.114756 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c92j7\" (UniqueName: \"kubernetes.io/projected/7557d9f9-d00b-4692-95e1-1bdd819aab0c-kube-api-access-c92j7\") pod \"ceilometer-0\" (UID: \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\") " pod="openstack/ceilometer-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.114781 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ae06196-4040-40db-9dd1-2f4a7c1f616c-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-l6t6p\" (UID: \"9ae06196-4040-40db-9dd1-2f4a7c1f616c\") " pod="openstack/dnsmasq-dns-845d6d6f59-l6t6p" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.164896 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.175818 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.216017 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7557d9f9-d00b-4692-95e1-1bdd819aab0c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\") " pod="openstack/ceilometer-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.216058 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c92j7\" (UniqueName: \"kubernetes.io/projected/7557d9f9-d00b-4692-95e1-1bdd819aab0c-kube-api-access-c92j7\") pod \"ceilometer-0\" (UID: \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\") " pod="openstack/ceilometer-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.216085 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ae06196-4040-40db-9dd1-2f4a7c1f616c-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-l6t6p\" (UID: \"9ae06196-4040-40db-9dd1-2f4a7c1f616c\") " pod="openstack/dnsmasq-dns-845d6d6f59-l6t6p" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.216109 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ae06196-4040-40db-9dd1-2f4a7c1f616c-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-l6t6p\" (UID: \"9ae06196-4040-40db-9dd1-2f4a7c1f616c\") " pod="openstack/dnsmasq-dns-845d6d6f59-l6t6p" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.216142 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7557d9f9-d00b-4692-95e1-1bdd819aab0c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\") " pod="openstack/ceilometer-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.216158 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjqx6\" (UniqueName: \"kubernetes.io/projected/9ae06196-4040-40db-9dd1-2f4a7c1f616c-kube-api-access-tjqx6\") pod \"dnsmasq-dns-845d6d6f59-l6t6p\" (UID: \"9ae06196-4040-40db-9dd1-2f4a7c1f616c\") " pod="openstack/dnsmasq-dns-845d6d6f59-l6t6p" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.216188 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7557d9f9-d00b-4692-95e1-1bdd819aab0c-log-httpd\") pod \"ceilometer-0\" (UID: \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\") " pod="openstack/ceilometer-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.216203 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7557d9f9-d00b-4692-95e1-1bdd819aab0c-scripts\") pod \"ceilometer-0\" (UID: \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\") " pod="openstack/ceilometer-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.216243 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdjjr\" (UniqueName: \"kubernetes.io/projected/be2afa25-77bd-421d-b0c6-67a3d31b642f-kube-api-access-hdjjr\") pod \"nova-cell1-novncproxy-0\" (UID: \"be2afa25-77bd-421d-b0c6-67a3d31b642f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.216282 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ae06196-4040-40db-9dd1-2f4a7c1f616c-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-l6t6p\" (UID: \"9ae06196-4040-40db-9dd1-2f4a7c1f616c\") " pod="openstack/dnsmasq-dns-845d6d6f59-l6t6p" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.216306 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/be2afa25-77bd-421d-b0c6-67a3d31b642f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"be2afa25-77bd-421d-b0c6-67a3d31b642f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.216322 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7557d9f9-d00b-4692-95e1-1bdd819aab0c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\") " pod="openstack/ceilometer-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.216337 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be2afa25-77bd-421d-b0c6-67a3d31b642f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"be2afa25-77bd-421d-b0c6-67a3d31b642f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.216351 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ae06196-4040-40db-9dd1-2f4a7c1f616c-config\") pod \"dnsmasq-dns-845d6d6f59-l6t6p\" (UID: \"9ae06196-4040-40db-9dd1-2f4a7c1f616c\") " pod="openstack/dnsmasq-dns-845d6d6f59-l6t6p" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.216381 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7557d9f9-d00b-4692-95e1-1bdd819aab0c-run-httpd\") pod \"ceilometer-0\" (UID: \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\") " pod="openstack/ceilometer-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.216404 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ae06196-4040-40db-9dd1-2f4a7c1f616c-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-l6t6p\" (UID: 
\"9ae06196-4040-40db-9dd1-2f4a7c1f616c\") " pod="openstack/dnsmasq-dns-845d6d6f59-l6t6p" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.216424 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7557d9f9-d00b-4692-95e1-1bdd819aab0c-config-data\") pod \"ceilometer-0\" (UID: \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\") " pod="openstack/ceilometer-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.218261 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ae06196-4040-40db-9dd1-2f4a7c1f616c-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-l6t6p\" (UID: \"9ae06196-4040-40db-9dd1-2f4a7c1f616c\") " pod="openstack/dnsmasq-dns-845d6d6f59-l6t6p" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.218948 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ae06196-4040-40db-9dd1-2f4a7c1f616c-config\") pod \"dnsmasq-dns-845d6d6f59-l6t6p\" (UID: \"9ae06196-4040-40db-9dd1-2f4a7c1f616c\") " pod="openstack/dnsmasq-dns-845d6d6f59-l6t6p" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.219230 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7557d9f9-d00b-4692-95e1-1bdd819aab0c-run-httpd\") pod \"ceilometer-0\" (UID: \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\") " pod="openstack/ceilometer-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.222017 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ae06196-4040-40db-9dd1-2f4a7c1f616c-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-l6t6p\" (UID: \"9ae06196-4040-40db-9dd1-2f4a7c1f616c\") " pod="openstack/dnsmasq-dns-845d6d6f59-l6t6p" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.222119 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ae06196-4040-40db-9dd1-2f4a7c1f616c-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-l6t6p\" (UID: \"9ae06196-4040-40db-9dd1-2f4a7c1f616c\") " pod="openstack/dnsmasq-dns-845d6d6f59-l6t6p" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.222245 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7557d9f9-d00b-4692-95e1-1bdd819aab0c-log-httpd\") pod \"ceilometer-0\" (UID: \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\") " pod="openstack/ceilometer-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.222558 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ae06196-4040-40db-9dd1-2f4a7c1f616c-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-l6t6p\" (UID: \"9ae06196-4040-40db-9dd1-2f4a7c1f616c\") " pod="openstack/dnsmasq-dns-845d6d6f59-l6t6p" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.224561 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7557d9f9-d00b-4692-95e1-1bdd819aab0c-config-data\") pod \"ceilometer-0\" (UID: \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\") " pod="openstack/ceilometer-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.230362 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7557d9f9-d00b-4692-95e1-1bdd819aab0c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\") " pod="openstack/ceilometer-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.231350 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7557d9f9-d00b-4692-95e1-1bdd819aab0c-scripts\") pod \"ceilometer-0\" (UID: 
\"7557d9f9-d00b-4692-95e1-1bdd819aab0c\") " pod="openstack/ceilometer-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.232595 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be2afa25-77bd-421d-b0c6-67a3d31b642f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"be2afa25-77bd-421d-b0c6-67a3d31b642f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.233056 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be2afa25-77bd-421d-b0c6-67a3d31b642f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"be2afa25-77bd-421d-b0c6-67a3d31b642f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.233945 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7557d9f9-d00b-4692-95e1-1bdd819aab0c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\") " pod="openstack/ceilometer-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.234244 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c92j7\" (UniqueName: \"kubernetes.io/projected/7557d9f9-d00b-4692-95e1-1bdd819aab0c-kube-api-access-c92j7\") pod \"ceilometer-0\" (UID: \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\") " pod="openstack/ceilometer-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.237020 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjqx6\" (UniqueName: \"kubernetes.io/projected/9ae06196-4040-40db-9dd1-2f4a7c1f616c-kube-api-access-tjqx6\") pod \"dnsmasq-dns-845d6d6f59-l6t6p\" (UID: \"9ae06196-4040-40db-9dd1-2f4a7c1f616c\") " pod="openstack/dnsmasq-dns-845d6d6f59-l6t6p" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.247571 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdjjr\" (UniqueName: \"kubernetes.io/projected/be2afa25-77bd-421d-b0c6-67a3d31b642f-kube-api-access-hdjjr\") pod \"nova-cell1-novncproxy-0\" (UID: \"be2afa25-77bd-421d-b0c6-67a3d31b642f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.249834 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7557d9f9-d00b-4692-95e1-1bdd819aab0c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\") " pod="openstack/ceilometer-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.355399 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.372683 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.381067 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-l6t6p" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.494616 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-hwtzm"] Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.537079 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2dqx7"] Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.543831 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2dqx7" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.550686 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2dqx7"] Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.551184 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.556980 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.560173 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.636080 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpl8d\" (UniqueName: \"kubernetes.io/projected/34d8e7eb-d10e-458f-ae05-e9c73c29b604-kube-api-access-hpl8d\") pod \"nova-cell1-conductor-db-sync-2dqx7\" (UID: \"34d8e7eb-d10e-458f-ae05-e9c73c29b604\") " pod="openstack/nova-cell1-conductor-db-sync-2dqx7" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.636157 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34d8e7eb-d10e-458f-ae05-e9c73c29b604-config-data\") pod \"nova-cell1-conductor-db-sync-2dqx7\" (UID: \"34d8e7eb-d10e-458f-ae05-e9c73c29b604\") " pod="openstack/nova-cell1-conductor-db-sync-2dqx7" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.636188 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34d8e7eb-d10e-458f-ae05-e9c73c29b604-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2dqx7\" (UID: \"34d8e7eb-d10e-458f-ae05-e9c73c29b604\") " 
pod="openstack/nova-cell1-conductor-db-sync-2dqx7" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.636300 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34d8e7eb-d10e-458f-ae05-e9c73c29b604-scripts\") pod \"nova-cell1-conductor-db-sync-2dqx7\" (UID: \"34d8e7eb-d10e-458f-ae05-e9c73c29b604\") " pod="openstack/nova-cell1-conductor-db-sync-2dqx7" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.738561 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34d8e7eb-d10e-458f-ae05-e9c73c29b604-scripts\") pod \"nova-cell1-conductor-db-sync-2dqx7\" (UID: \"34d8e7eb-d10e-458f-ae05-e9c73c29b604\") " pod="openstack/nova-cell1-conductor-db-sync-2dqx7" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.738673 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpl8d\" (UniqueName: \"kubernetes.io/projected/34d8e7eb-d10e-458f-ae05-e9c73c29b604-kube-api-access-hpl8d\") pod \"nova-cell1-conductor-db-sync-2dqx7\" (UID: \"34d8e7eb-d10e-458f-ae05-e9c73c29b604\") " pod="openstack/nova-cell1-conductor-db-sync-2dqx7" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.738711 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34d8e7eb-d10e-458f-ae05-e9c73c29b604-config-data\") pod \"nova-cell1-conductor-db-sync-2dqx7\" (UID: \"34d8e7eb-d10e-458f-ae05-e9c73c29b604\") " pod="openstack/nova-cell1-conductor-db-sync-2dqx7" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.738734 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34d8e7eb-d10e-458f-ae05-e9c73c29b604-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2dqx7\" (UID: \"34d8e7eb-d10e-458f-ae05-e9c73c29b604\") " 
pod="openstack/nova-cell1-conductor-db-sync-2dqx7" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.745507 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.750729 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b9e6634b-3e22-4f35-a3e6-59053bb806fc","Type":"ContainerStarted","Data":"40fa10a79749142e21e12e6262ec8629c59fdd4b4d2ac86ea6f7f30ae7fcec60"} Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.766994 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34d8e7eb-d10e-458f-ae05-e9c73c29b604-scripts\") pod \"nova-cell1-conductor-db-sync-2dqx7\" (UID: \"34d8e7eb-d10e-458f-ae05-e9c73c29b604\") " pod="openstack/nova-cell1-conductor-db-sync-2dqx7" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.771333 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34d8e7eb-d10e-458f-ae05-e9c73c29b604-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2dqx7\" (UID: \"34d8e7eb-d10e-458f-ae05-e9c73c29b604\") " pod="openstack/nova-cell1-conductor-db-sync-2dqx7" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.775262 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34d8e7eb-d10e-458f-ae05-e9c73c29b604-config-data\") pod \"nova-cell1-conductor-db-sync-2dqx7\" (UID: \"34d8e7eb-d10e-458f-ae05-e9c73c29b604\") " pod="openstack/nova-cell1-conductor-db-sync-2dqx7" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.783147 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpl8d\" (UniqueName: \"kubernetes.io/projected/34d8e7eb-d10e-458f-ae05-e9c73c29b604-kube-api-access-hpl8d\") pod \"nova-cell1-conductor-db-sync-2dqx7\" (UID: 
\"34d8e7eb-d10e-458f-ae05-e9c73c29b604\") " pod="openstack/nova-cell1-conductor-db-sync-2dqx7" Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.793614 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-hwtzm" event={"ID":"199d79d9-6c17-4d68-af5f-623ab4ceb059","Type":"ContainerStarted","Data":"59210eb4b3868e9cf4858d87040d89f75b89947a45b56fb464a4914965f48778"} Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.802157 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 10:35:36 crc kubenswrapper[4735]: I1001 10:35:36.878798 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2dqx7" Oct 01 10:35:37 crc kubenswrapper[4735]: I1001 10:35:37.116786 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 10:35:37 crc kubenswrapper[4735]: I1001 10:35:37.215661 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:35:37 crc kubenswrapper[4735]: I1001 10:35:37.234370 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-l6t6p"] Oct 01 10:35:37 crc kubenswrapper[4735]: W1001 10:35:37.235991 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ae06196_4040_40db_9dd1_2f4a7c1f616c.slice/crio-11b76b8ee8009b0ff07204778d44ca7ae6d03274afa2f240de1a8938c1b7964b WatchSource:0}: Error finding container 11b76b8ee8009b0ff07204778d44ca7ae6d03274afa2f240de1a8938c1b7964b: Status 404 returned error can't find the container with id 11b76b8ee8009b0ff07204778d44ca7ae6d03274afa2f240de1a8938c1b7964b Oct 01 10:35:37 crc kubenswrapper[4735]: I1001 10:35:37.365039 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2dqx7"] Oct 01 10:35:37 crc kubenswrapper[4735]: W1001 10:35:37.369362 
4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34d8e7eb_d10e_458f_ae05_e9c73c29b604.slice/crio-331490ace23a0cab9b4059c7aff8dd99fd1dd96f7efa04208b7420d41a72fb5b WatchSource:0}: Error finding container 331490ace23a0cab9b4059c7aff8dd99fd1dd96f7efa04208b7420d41a72fb5b: Status 404 returned error can't find the container with id 331490ace23a0cab9b4059c7aff8dd99fd1dd96f7efa04208b7420d41a72fb5b Oct 01 10:35:37 crc kubenswrapper[4735]: I1001 10:35:37.811027 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"be2afa25-77bd-421d-b0c6-67a3d31b642f","Type":"ContainerStarted","Data":"f6e55e93e5b330776f9b3f61a4efd9d8f37592f2acb280ed18eafb2417937d1c"} Oct 01 10:35:37 crc kubenswrapper[4735]: I1001 10:35:37.816256 4735 generic.go:334] "Generic (PLEG): container finished" podID="9ae06196-4040-40db-9dd1-2f4a7c1f616c" containerID="008f508382ea043136330aa3d2f98a5a30ff05ccd4ebe25d58be351e0d0bbffe" exitCode=0 Oct 01 10:35:37 crc kubenswrapper[4735]: I1001 10:35:37.816346 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-l6t6p" event={"ID":"9ae06196-4040-40db-9dd1-2f4a7c1f616c","Type":"ContainerDied","Data":"008f508382ea043136330aa3d2f98a5a30ff05ccd4ebe25d58be351e0d0bbffe"} Oct 01 10:35:37 crc kubenswrapper[4735]: I1001 10:35:37.816373 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-l6t6p" event={"ID":"9ae06196-4040-40db-9dd1-2f4a7c1f616c","Type":"ContainerStarted","Data":"11b76b8ee8009b0ff07204778d44ca7ae6d03274afa2f240de1a8938c1b7964b"} Oct 01 10:35:37 crc kubenswrapper[4735]: I1001 10:35:37.818245 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-hwtzm" event={"ID":"199d79d9-6c17-4d68-af5f-623ab4ceb059","Type":"ContainerStarted","Data":"6d0079480629a7712148c34edf9c8b66c25b76d789d745048a3e3e522135452c"} Oct 01 
10:35:37 crc kubenswrapper[4735]: I1001 10:35:37.820045 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1cd7d42f-bcb3-49a7-aef2-1372a983e375","Type":"ContainerStarted","Data":"93fe1e8ef8d8ed3927b41fe955a5a4af12d52a6570c8e6665ef2901b39fc4389"} Oct 01 10:35:37 crc kubenswrapper[4735]: I1001 10:35:37.821385 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"66442e52-6cd9-44d4-a77f-9d67f83d4d94","Type":"ContainerStarted","Data":"b72e43986118fc2acd392046153af60addab06008ecf87128a1d1f19174fc565"} Oct 01 10:35:37 crc kubenswrapper[4735]: I1001 10:35:37.824199 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7557d9f9-d00b-4692-95e1-1bdd819aab0c","Type":"ContainerStarted","Data":"44665426dde9a835439f6fb8634453468f3f3a1432a33ca89acc4c299260e5cf"} Oct 01 10:35:37 crc kubenswrapper[4735]: I1001 10:35:37.843282 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2dqx7" event={"ID":"34d8e7eb-d10e-458f-ae05-e9c73c29b604","Type":"ContainerStarted","Data":"cdbbe339009129f1757f6cc96124ca7315285304666ccec60b42445640c4dd14"} Oct 01 10:35:37 crc kubenswrapper[4735]: I1001 10:35:37.843543 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2dqx7" event={"ID":"34d8e7eb-d10e-458f-ae05-e9c73c29b604","Type":"ContainerStarted","Data":"331490ace23a0cab9b4059c7aff8dd99fd1dd96f7efa04208b7420d41a72fb5b"} Oct 01 10:35:37 crc kubenswrapper[4735]: I1001 10:35:37.870839 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-hwtzm" podStartSLOduration=2.870821962 podStartE2EDuration="2.870821962s" podCreationTimestamp="2025-10-01 10:35:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:35:37.850913657 
+0000 UTC m=+1096.543734919" watchObservedRunningTime="2025-10-01 10:35:37.870821962 +0000 UTC m=+1096.563643224" Oct 01 10:35:37 crc kubenswrapper[4735]: I1001 10:35:37.877375 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-2dqx7" podStartSLOduration=1.877361258 podStartE2EDuration="1.877361258s" podCreationTimestamp="2025-10-01 10:35:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:35:37.864913064 +0000 UTC m=+1096.557734326" watchObservedRunningTime="2025-10-01 10:35:37.877361258 +0000 UTC m=+1096.570182520" Oct 01 10:35:38 crc kubenswrapper[4735]: I1001 10:35:38.878222 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7557d9f9-d00b-4692-95e1-1bdd819aab0c","Type":"ContainerStarted","Data":"82c84424392ab7e34ff96df026ef17b268a71cf33f4aa6c9c82074ad93dae7ed"} Oct 01 10:35:38 crc kubenswrapper[4735]: I1001 10:35:38.885214 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-l6t6p" event={"ID":"9ae06196-4040-40db-9dd1-2f4a7c1f616c","Type":"ContainerStarted","Data":"b9889dfa44cc582345a7da54aff09deb72e53dbd37a1870a6bd609b9c69767df"} Oct 01 10:35:38 crc kubenswrapper[4735]: I1001 10:35:38.885624 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-845d6d6f59-l6t6p" Oct 01 10:35:38 crc kubenswrapper[4735]: I1001 10:35:38.911820 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-845d6d6f59-l6t6p" podStartSLOduration=3.911808038 podStartE2EDuration="3.911808038s" podCreationTimestamp="2025-10-01 10:35:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:35:38.909954579 +0000 UTC m=+1097.602775841" 
watchObservedRunningTime="2025-10-01 10:35:38.911808038 +0000 UTC m=+1097.604629290" Oct 01 10:35:39 crc kubenswrapper[4735]: I1001 10:35:39.336832 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 10:35:39 crc kubenswrapper[4735]: I1001 10:35:39.365537 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 10:35:40 crc kubenswrapper[4735]: I1001 10:35:40.050378 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 01 10:35:40 crc kubenswrapper[4735]: I1001 10:35:40.902193 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1cd7d42f-bcb3-49a7-aef2-1372a983e375","Type":"ContainerStarted","Data":"c9da29d655aa49f32d50d08b55364eeeea48fa2e8d7fc228f0b16ecce33b4320"} Oct 01 10:35:40 crc kubenswrapper[4735]: I1001 10:35:40.904219 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"66442e52-6cd9-44d4-a77f-9d67f83d4d94","Type":"ContainerStarted","Data":"3ec3ef57eeb0651548c3ecea3aba543d978deb1903c04e2fc0a0fabaf0345c74"} Oct 01 10:35:40 crc kubenswrapper[4735]: I1001 10:35:40.904275 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"66442e52-6cd9-44d4-a77f-9d67f83d4d94","Type":"ContainerStarted","Data":"7ae56c1b4a1be80014ed2d8f1f33cf87a325e10dfc018ac6fce6cde9e673547d"} Oct 01 10:35:40 crc kubenswrapper[4735]: I1001 10:35:40.904282 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="66442e52-6cd9-44d4-a77f-9d67f83d4d94" containerName="nova-metadata-log" containerID="cri-o://7ae56c1b4a1be80014ed2d8f1f33cf87a325e10dfc018ac6fce6cde9e673547d" gracePeriod=30 Oct 01 10:35:40 crc kubenswrapper[4735]: I1001 10:35:40.904304 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="66442e52-6cd9-44d4-a77f-9d67f83d4d94" containerName="nova-metadata-metadata" containerID="cri-o://3ec3ef57eeb0651548c3ecea3aba543d978deb1903c04e2fc0a0fabaf0345c74" gracePeriod=30 Oct 01 10:35:40 crc kubenswrapper[4735]: I1001 10:35:40.910350 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7557d9f9-d00b-4692-95e1-1bdd819aab0c","Type":"ContainerStarted","Data":"bcdf4338c51f60c8149c69f2bb0aae8f3a98e3fefe87f13720a8a25233e6382e"} Oct 01 10:35:40 crc kubenswrapper[4735]: I1001 10:35:40.914211 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"be2afa25-77bd-421d-b0c6-67a3d31b642f","Type":"ContainerStarted","Data":"b39cca5427a74f9f23f708f6681caadec1c5360fb3d36e84e9dffa33ca53e252"} Oct 01 10:35:40 crc kubenswrapper[4735]: I1001 10:35:40.914321 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="be2afa25-77bd-421d-b0c6-67a3d31b642f" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://b39cca5427a74f9f23f708f6681caadec1c5360fb3d36e84e9dffa33ca53e252" gracePeriod=30 Oct 01 10:35:40 crc kubenswrapper[4735]: I1001 10:35:40.917938 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b9e6634b-3e22-4f35-a3e6-59053bb806fc","Type":"ContainerStarted","Data":"6eb919ed4af819d7a93df15fe34234f3dcbeaaafcca08647f348d6fff027f4b9"} Oct 01 10:35:40 crc kubenswrapper[4735]: I1001 10:35:40.917968 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b9e6634b-3e22-4f35-a3e6-59053bb806fc","Type":"ContainerStarted","Data":"78a38140c4d9ed2d0869d7d87e4ba637ec8868920d977b0c525bdfe590af68ad"} Oct 01 10:35:40 crc kubenswrapper[4735]: I1001 10:35:40.945809 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.685841683 podStartE2EDuration="5.94579271s" 
podCreationTimestamp="2025-10-01 10:35:35 +0000 UTC" firstStartedPulling="2025-10-01 10:35:36.830437953 +0000 UTC m=+1095.523259215" lastFinishedPulling="2025-10-01 10:35:40.09038897 +0000 UTC m=+1098.783210242" observedRunningTime="2025-10-01 10:35:40.926733038 +0000 UTC m=+1099.619554310" watchObservedRunningTime="2025-10-01 10:35:40.94579271 +0000 UTC m=+1099.638613962" Oct 01 10:35:40 crc kubenswrapper[4735]: I1001 10:35:40.946941 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.571647506 podStartE2EDuration="5.94693537s" podCreationTimestamp="2025-10-01 10:35:35 +0000 UTC" firstStartedPulling="2025-10-01 10:35:36.73984476 +0000 UTC m=+1095.432666022" lastFinishedPulling="2025-10-01 10:35:40.115132604 +0000 UTC m=+1098.807953886" observedRunningTime="2025-10-01 10:35:40.942660805 +0000 UTC m=+1099.635482067" watchObservedRunningTime="2025-10-01 10:35:40.94693537 +0000 UTC m=+1099.639756632" Oct 01 10:35:40 crc kubenswrapper[4735]: I1001 10:35:40.962387 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.959754722 podStartE2EDuration="5.962369985s" podCreationTimestamp="2025-10-01 10:35:35 +0000 UTC" firstStartedPulling="2025-10-01 10:35:37.120540817 +0000 UTC m=+1095.813362079" lastFinishedPulling="2025-10-01 10:35:40.12315607 +0000 UTC m=+1098.815977342" observedRunningTime="2025-10-01 10:35:40.95810774 +0000 UTC m=+1099.650928992" watchObservedRunningTime="2025-10-01 10:35:40.962369985 +0000 UTC m=+1099.655191247" Oct 01 10:35:40 crc kubenswrapper[4735]: I1001 10:35:40.979605 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.686569783 podStartE2EDuration="5.979583668s" podCreationTimestamp="2025-10-01 10:35:35 +0000 UTC" firstStartedPulling="2025-10-01 10:35:36.829594531 +0000 UTC m=+1095.522415793" lastFinishedPulling="2025-10-01 
10:35:40.122608406 +0000 UTC m=+1098.815429678" observedRunningTime="2025-10-01 10:35:40.973138544 +0000 UTC m=+1099.665959806" watchObservedRunningTime="2025-10-01 10:35:40.979583668 +0000 UTC m=+1099.672404930" Oct 01 10:35:41 crc kubenswrapper[4735]: I1001 10:35:41.165764 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 01 10:35:41 crc kubenswrapper[4735]: I1001 10:35:41.176958 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 10:35:41 crc kubenswrapper[4735]: I1001 10:35:41.177007 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 10:35:41 crc kubenswrapper[4735]: I1001 10:35:41.356387 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 01 10:35:41 crc kubenswrapper[4735]: I1001 10:35:41.936115 4735 generic.go:334] "Generic (PLEG): container finished" podID="66442e52-6cd9-44d4-a77f-9d67f83d4d94" containerID="7ae56c1b4a1be80014ed2d8f1f33cf87a325e10dfc018ac6fce6cde9e673547d" exitCode=143 Oct 01 10:35:41 crc kubenswrapper[4735]: I1001 10:35:41.936460 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"66442e52-6cd9-44d4-a77f-9d67f83d4d94","Type":"ContainerDied","Data":"7ae56c1b4a1be80014ed2d8f1f33cf87a325e10dfc018ac6fce6cde9e673547d"} Oct 01 10:35:41 crc kubenswrapper[4735]: I1001 10:35:41.942957 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7557d9f9-d00b-4692-95e1-1bdd819aab0c","Type":"ContainerStarted","Data":"1dfbf3599f5781c52cbb193ad097c3deb8c478248084dd9831b650b227a964d3"} Oct 01 10:35:42 crc kubenswrapper[4735]: I1001 10:35:42.952618 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7557d9f9-d00b-4692-95e1-1bdd819aab0c","Type":"ContainerStarted","Data":"11259723491c0a425c7bd72639e41d4500cab64791f279aabc736af27d3049ce"} Oct 01 10:35:42 crc kubenswrapper[4735]: I1001 10:35:42.975733 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.62318459 podStartE2EDuration="7.975710143s" podCreationTimestamp="2025-10-01 10:35:35 +0000 UTC" firstStartedPulling="2025-10-01 10:35:37.219223408 +0000 UTC m=+1095.912044670" lastFinishedPulling="2025-10-01 10:35:42.571748961 +0000 UTC m=+1101.264570223" observedRunningTime="2025-10-01 10:35:42.970187575 +0000 UTC m=+1101.663008857" watchObservedRunningTime="2025-10-01 10:35:42.975710143 +0000 UTC m=+1101.668531405" Oct 01 10:35:43 crc kubenswrapper[4735]: I1001 10:35:43.966925 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 10:35:44 crc kubenswrapper[4735]: I1001 10:35:44.991363 4735 generic.go:334] "Generic (PLEG): container finished" podID="199d79d9-6c17-4d68-af5f-623ab4ceb059" containerID="6d0079480629a7712148c34edf9c8b66c25b76d789d745048a3e3e522135452c" exitCode=0 Oct 01 10:35:44 crc kubenswrapper[4735]: I1001 10:35:44.992266 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-hwtzm" event={"ID":"199d79d9-6c17-4d68-af5f-623ab4ceb059","Type":"ContainerDied","Data":"6d0079480629a7712148c34edf9c8b66c25b76d789d745048a3e3e522135452c"} Oct 01 10:35:45 crc kubenswrapper[4735]: I1001 10:35:45.986441 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 10:35:45 crc kubenswrapper[4735]: I1001 10:35:45.986880 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 10:35:46 crc kubenswrapper[4735]: I1001 10:35:46.166635 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 
01 10:35:46 crc kubenswrapper[4735]: I1001 10:35:46.213636 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 01 10:35:46 crc kubenswrapper[4735]: I1001 10:35:46.374304 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-hwtzm" Oct 01 10:35:46 crc kubenswrapper[4735]: I1001 10:35:46.382625 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-845d6d6f59-l6t6p" Oct 01 10:35:46 crc kubenswrapper[4735]: I1001 10:35:46.458289 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-vbx9b"] Oct 01 10:35:46 crc kubenswrapper[4735]: I1001 10:35:46.458563 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5784cf869f-vbx9b" podUID="309b0308-003c-4df0-880f-3b35d5607b1c" containerName="dnsmasq-dns" containerID="cri-o://1ed2fe352e741f9f62e8767595db6f4bc8a0760fd5a653c23a4aacb8dd6bc1a7" gracePeriod=10 Oct 01 10:35:46 crc kubenswrapper[4735]: I1001 10:35:46.546665 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/199d79d9-6c17-4d68-af5f-623ab4ceb059-scripts\") pod \"199d79d9-6c17-4d68-af5f-623ab4ceb059\" (UID: \"199d79d9-6c17-4d68-af5f-623ab4ceb059\") " Oct 01 10:35:46 crc kubenswrapper[4735]: I1001 10:35:46.547163 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnsl4\" (UniqueName: \"kubernetes.io/projected/199d79d9-6c17-4d68-af5f-623ab4ceb059-kube-api-access-lnsl4\") pod \"199d79d9-6c17-4d68-af5f-623ab4ceb059\" (UID: \"199d79d9-6c17-4d68-af5f-623ab4ceb059\") " Oct 01 10:35:46 crc kubenswrapper[4735]: I1001 10:35:46.547274 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/199d79d9-6c17-4d68-af5f-623ab4ceb059-combined-ca-bundle\") pod \"199d79d9-6c17-4d68-af5f-623ab4ceb059\" (UID: \"199d79d9-6c17-4d68-af5f-623ab4ceb059\") " Oct 01 10:35:46 crc kubenswrapper[4735]: I1001 10:35:46.547369 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/199d79d9-6c17-4d68-af5f-623ab4ceb059-config-data\") pod \"199d79d9-6c17-4d68-af5f-623ab4ceb059\" (UID: \"199d79d9-6c17-4d68-af5f-623ab4ceb059\") " Oct 01 10:35:46 crc kubenswrapper[4735]: I1001 10:35:46.552875 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/199d79d9-6c17-4d68-af5f-623ab4ceb059-kube-api-access-lnsl4" (OuterVolumeSpecName: "kube-api-access-lnsl4") pod "199d79d9-6c17-4d68-af5f-623ab4ceb059" (UID: "199d79d9-6c17-4d68-af5f-623ab4ceb059"). InnerVolumeSpecName "kube-api-access-lnsl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:35:46 crc kubenswrapper[4735]: I1001 10:35:46.559763 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/199d79d9-6c17-4d68-af5f-623ab4ceb059-scripts" (OuterVolumeSpecName: "scripts") pod "199d79d9-6c17-4d68-af5f-623ab4ceb059" (UID: "199d79d9-6c17-4d68-af5f-623ab4ceb059"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:35:46 crc kubenswrapper[4735]: I1001 10:35:46.577799 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/199d79d9-6c17-4d68-af5f-623ab4ceb059-config-data" (OuterVolumeSpecName: "config-data") pod "199d79d9-6c17-4d68-af5f-623ab4ceb059" (UID: "199d79d9-6c17-4d68-af5f-623ab4ceb059"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:35:46 crc kubenswrapper[4735]: I1001 10:35:46.598765 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/199d79d9-6c17-4d68-af5f-623ab4ceb059-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "199d79d9-6c17-4d68-af5f-623ab4ceb059" (UID: "199d79d9-6c17-4d68-af5f-623ab4ceb059"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:35:46 crc kubenswrapper[4735]: I1001 10:35:46.651292 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnsl4\" (UniqueName: \"kubernetes.io/projected/199d79d9-6c17-4d68-af5f-623ab4ceb059-kube-api-access-lnsl4\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:46 crc kubenswrapper[4735]: I1001 10:35:46.651325 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/199d79d9-6c17-4d68-af5f-623ab4ceb059-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:46 crc kubenswrapper[4735]: I1001 10:35:46.651334 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/199d79d9-6c17-4d68-af5f-623ab4ceb059-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:46 crc kubenswrapper[4735]: I1001 10:35:46.651341 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/199d79d9-6c17-4d68-af5f-623ab4ceb059-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:46 crc kubenswrapper[4735]: I1001 10:35:46.929015 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-vbx9b" Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.009685 4735 generic.go:334] "Generic (PLEG): container finished" podID="309b0308-003c-4df0-880f-3b35d5607b1c" containerID="1ed2fe352e741f9f62e8767595db6f4bc8a0760fd5a653c23a4aacb8dd6bc1a7" exitCode=0 Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.009743 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-vbx9b" event={"ID":"309b0308-003c-4df0-880f-3b35d5607b1c","Type":"ContainerDied","Data":"1ed2fe352e741f9f62e8767595db6f4bc8a0760fd5a653c23a4aacb8dd6bc1a7"} Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.009769 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-vbx9b" event={"ID":"309b0308-003c-4df0-880f-3b35d5607b1c","Type":"ContainerDied","Data":"af9a914305e5f37c484378af83c2264b458861fd898a5cae2774c680766cee66"} Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.009787 4735 scope.go:117] "RemoveContainer" containerID="1ed2fe352e741f9f62e8767595db6f4bc8a0760fd5a653c23a4aacb8dd6bc1a7" Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.009888 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-vbx9b" Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.012092 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-hwtzm" Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.012088 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-hwtzm" event={"ID":"199d79d9-6c17-4d68-af5f-623ab4ceb059","Type":"ContainerDied","Data":"59210eb4b3868e9cf4858d87040d89f75b89947a45b56fb464a4914965f48778"} Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.012128 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59210eb4b3868e9cf4858d87040d89f75b89947a45b56fb464a4914965f48778" Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.034023 4735 scope.go:117] "RemoveContainer" containerID="a51a28404f6679b6145737c9315e52a490a6570b56f0dae57bd561f2af63f201" Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.055024 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.057946 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/309b0308-003c-4df0-880f-3b35d5607b1c-dns-swift-storage-0\") pod \"309b0308-003c-4df0-880f-3b35d5607b1c\" (UID: \"309b0308-003c-4df0-880f-3b35d5607b1c\") " Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.058297 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/309b0308-003c-4df0-880f-3b35d5607b1c-ovsdbserver-sb\") pod \"309b0308-003c-4df0-880f-3b35d5607b1c\" (UID: \"309b0308-003c-4df0-880f-3b35d5607b1c\") " Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.058399 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hfqd\" (UniqueName: \"kubernetes.io/projected/309b0308-003c-4df0-880f-3b35d5607b1c-kube-api-access-9hfqd\") pod \"309b0308-003c-4df0-880f-3b35d5607b1c\" (UID: 
\"309b0308-003c-4df0-880f-3b35d5607b1c\") " Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.058481 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/309b0308-003c-4df0-880f-3b35d5607b1c-dns-svc\") pod \"309b0308-003c-4df0-880f-3b35d5607b1c\" (UID: \"309b0308-003c-4df0-880f-3b35d5607b1c\") " Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.058727 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/309b0308-003c-4df0-880f-3b35d5607b1c-config\") pod \"309b0308-003c-4df0-880f-3b35d5607b1c\" (UID: \"309b0308-003c-4df0-880f-3b35d5607b1c\") " Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.058825 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/309b0308-003c-4df0-880f-3b35d5607b1c-ovsdbserver-nb\") pod \"309b0308-003c-4df0-880f-3b35d5607b1c\" (UID: \"309b0308-003c-4df0-880f-3b35d5607b1c\") " Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.063248 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/309b0308-003c-4df0-880f-3b35d5607b1c-kube-api-access-9hfqd" (OuterVolumeSpecName: "kube-api-access-9hfqd") pod "309b0308-003c-4df0-880f-3b35d5607b1c" (UID: "309b0308-003c-4df0-880f-3b35d5607b1c"). InnerVolumeSpecName "kube-api-access-9hfqd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.064389 4735 scope.go:117] "RemoveContainer" containerID="1ed2fe352e741f9f62e8767595db6f4bc8a0760fd5a653c23a4aacb8dd6bc1a7" Oct 01 10:35:47 crc kubenswrapper[4735]: E1001 10:35:47.064877 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ed2fe352e741f9f62e8767595db6f4bc8a0760fd5a653c23a4aacb8dd6bc1a7\": container with ID starting with 1ed2fe352e741f9f62e8767595db6f4bc8a0760fd5a653c23a4aacb8dd6bc1a7 not found: ID does not exist" containerID="1ed2fe352e741f9f62e8767595db6f4bc8a0760fd5a653c23a4aacb8dd6bc1a7" Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.064918 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ed2fe352e741f9f62e8767595db6f4bc8a0760fd5a653c23a4aacb8dd6bc1a7"} err="failed to get container status \"1ed2fe352e741f9f62e8767595db6f4bc8a0760fd5a653c23a4aacb8dd6bc1a7\": rpc error: code = NotFound desc = could not find container \"1ed2fe352e741f9f62e8767595db6f4bc8a0760fd5a653c23a4aacb8dd6bc1a7\": container with ID starting with 1ed2fe352e741f9f62e8767595db6f4bc8a0760fd5a653c23a4aacb8dd6bc1a7 not found: ID does not exist" Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.064942 4735 scope.go:117] "RemoveContainer" containerID="a51a28404f6679b6145737c9315e52a490a6570b56f0dae57bd561f2af63f201" Oct 01 10:35:47 crc kubenswrapper[4735]: E1001 10:35:47.065394 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a51a28404f6679b6145737c9315e52a490a6570b56f0dae57bd561f2af63f201\": container with ID starting with a51a28404f6679b6145737c9315e52a490a6570b56f0dae57bd561f2af63f201 not found: ID does not exist" containerID="a51a28404f6679b6145737c9315e52a490a6570b56f0dae57bd561f2af63f201" Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.065420 
4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a51a28404f6679b6145737c9315e52a490a6570b56f0dae57bd561f2af63f201"} err="failed to get container status \"a51a28404f6679b6145737c9315e52a490a6570b56f0dae57bd561f2af63f201\": rpc error: code = NotFound desc = could not find container \"a51a28404f6679b6145737c9315e52a490a6570b56f0dae57bd561f2af63f201\": container with ID starting with a51a28404f6679b6145737c9315e52a490a6570b56f0dae57bd561f2af63f201 not found: ID does not exist" Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.068629 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b9e6634b-3e22-4f35-a3e6-59053bb806fc" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.068661 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b9e6634b-3e22-4f35-a3e6-59053bb806fc" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.112878 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/309b0308-003c-4df0-880f-3b35d5607b1c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "309b0308-003c-4df0-880f-3b35d5607b1c" (UID: "309b0308-003c-4df0-880f-3b35d5607b1c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.114873 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/309b0308-003c-4df0-880f-3b35d5607b1c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "309b0308-003c-4df0-880f-3b35d5607b1c" (UID: "309b0308-003c-4df0-880f-3b35d5607b1c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.115836 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/309b0308-003c-4df0-880f-3b35d5607b1c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "309b0308-003c-4df0-880f-3b35d5607b1c" (UID: "309b0308-003c-4df0-880f-3b35d5607b1c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.117106 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/309b0308-003c-4df0-880f-3b35d5607b1c-config" (OuterVolumeSpecName: "config") pod "309b0308-003c-4df0-880f-3b35d5607b1c" (UID: "309b0308-003c-4df0-880f-3b35d5607b1c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.118156 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/309b0308-003c-4df0-880f-3b35d5607b1c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "309b0308-003c-4df0-880f-3b35d5607b1c" (UID: "309b0308-003c-4df0-880f-3b35d5607b1c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.161207 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hfqd\" (UniqueName: \"kubernetes.io/projected/309b0308-003c-4df0-880f-3b35d5607b1c-kube-api-access-9hfqd\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.161236 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/309b0308-003c-4df0-880f-3b35d5607b1c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.161245 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/309b0308-003c-4df0-880f-3b35d5607b1c-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.161257 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/309b0308-003c-4df0-880f-3b35d5607b1c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.161265 4735 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/309b0308-003c-4df0-880f-3b35d5607b1c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.161274 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/309b0308-003c-4df0-880f-3b35d5607b1c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.179326 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.179641 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="b9e6634b-3e22-4f35-a3e6-59053bb806fc" containerName="nova-api-log" containerID="cri-o://78a38140c4d9ed2d0869d7d87e4ba637ec8868920d977b0c525bdfe590af68ad" gracePeriod=30 Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.180018 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b9e6634b-3e22-4f35-a3e6-59053bb806fc" containerName="nova-api-api" containerID="cri-o://6eb919ed4af819d7a93df15fe34234f3dcbeaaafcca08647f348d6fff027f4b9" gracePeriod=30 Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.382083 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-vbx9b"] Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.393061 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-vbx9b"] Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.464683 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 10:35:47 crc kubenswrapper[4735]: I1001 10:35:47.911219 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="309b0308-003c-4df0-880f-3b35d5607b1c" path="/var/lib/kubelet/pods/309b0308-003c-4df0-880f-3b35d5607b1c/volumes" Oct 01 10:35:48 crc kubenswrapper[4735]: I1001 10:35:48.025232 4735 generic.go:334] "Generic (PLEG): container finished" podID="b9e6634b-3e22-4f35-a3e6-59053bb806fc" containerID="78a38140c4d9ed2d0869d7d87e4ba637ec8868920d977b0c525bdfe590af68ad" exitCode=143 Oct 01 10:35:48 crc kubenswrapper[4735]: I1001 10:35:48.025298 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b9e6634b-3e22-4f35-a3e6-59053bb806fc","Type":"ContainerDied","Data":"78a38140c4d9ed2d0869d7d87e4ba637ec8868920d977b0c525bdfe590af68ad"} Oct 01 10:35:49 crc kubenswrapper[4735]: I1001 10:35:49.040115 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" 
podUID="1cd7d42f-bcb3-49a7-aef2-1372a983e375" containerName="nova-scheduler-scheduler" containerID="cri-o://c9da29d655aa49f32d50d08b55364eeeea48fa2e8d7fc228f0b16ecce33b4320" gracePeriod=30 Oct 01 10:35:50 crc kubenswrapper[4735]: I1001 10:35:50.053799 4735 generic.go:334] "Generic (PLEG): container finished" podID="34d8e7eb-d10e-458f-ae05-e9c73c29b604" containerID="cdbbe339009129f1757f6cc96124ca7315285304666ccec60b42445640c4dd14" exitCode=0 Oct 01 10:35:50 crc kubenswrapper[4735]: I1001 10:35:50.053900 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2dqx7" event={"ID":"34d8e7eb-d10e-458f-ae05-e9c73c29b604","Type":"ContainerDied","Data":"cdbbe339009129f1757f6cc96124ca7315285304666ccec60b42445640c4dd14"} Oct 01 10:35:51 crc kubenswrapper[4735]: E1001 10:35:51.170869 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c9da29d655aa49f32d50d08b55364eeeea48fa2e8d7fc228f0b16ecce33b4320" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 01 10:35:51 crc kubenswrapper[4735]: E1001 10:35:51.176094 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c9da29d655aa49f32d50d08b55364eeeea48fa2e8d7fc228f0b16ecce33b4320" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 01 10:35:51 crc kubenswrapper[4735]: E1001 10:35:51.182375 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c9da29d655aa49f32d50d08b55364eeeea48fa2e8d7fc228f0b16ecce33b4320" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 01 10:35:51 crc kubenswrapper[4735]: E1001 
10:35:51.182528 4735 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="1cd7d42f-bcb3-49a7-aef2-1372a983e375" containerName="nova-scheduler-scheduler" Oct 01 10:35:51 crc kubenswrapper[4735]: I1001 10:35:51.460943 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2dqx7" Oct 01 10:35:51 crc kubenswrapper[4735]: I1001 10:35:51.543269 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34d8e7eb-d10e-458f-ae05-e9c73c29b604-combined-ca-bundle\") pod \"34d8e7eb-d10e-458f-ae05-e9c73c29b604\" (UID: \"34d8e7eb-d10e-458f-ae05-e9c73c29b604\") " Oct 01 10:35:51 crc kubenswrapper[4735]: I1001 10:35:51.543363 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34d8e7eb-d10e-458f-ae05-e9c73c29b604-scripts\") pod \"34d8e7eb-d10e-458f-ae05-e9c73c29b604\" (UID: \"34d8e7eb-d10e-458f-ae05-e9c73c29b604\") " Oct 01 10:35:51 crc kubenswrapper[4735]: I1001 10:35:51.543381 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpl8d\" (UniqueName: \"kubernetes.io/projected/34d8e7eb-d10e-458f-ae05-e9c73c29b604-kube-api-access-hpl8d\") pod \"34d8e7eb-d10e-458f-ae05-e9c73c29b604\" (UID: \"34d8e7eb-d10e-458f-ae05-e9c73c29b604\") " Oct 01 10:35:51 crc kubenswrapper[4735]: I1001 10:35:51.543430 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34d8e7eb-d10e-458f-ae05-e9c73c29b604-config-data\") pod \"34d8e7eb-d10e-458f-ae05-e9c73c29b604\" (UID: \"34d8e7eb-d10e-458f-ae05-e9c73c29b604\") " Oct 01 10:35:51 crc kubenswrapper[4735]: I1001 
10:35:51.549391 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34d8e7eb-d10e-458f-ae05-e9c73c29b604-kube-api-access-hpl8d" (OuterVolumeSpecName: "kube-api-access-hpl8d") pod "34d8e7eb-d10e-458f-ae05-e9c73c29b604" (UID: "34d8e7eb-d10e-458f-ae05-e9c73c29b604"). InnerVolumeSpecName "kube-api-access-hpl8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:35:51 crc kubenswrapper[4735]: I1001 10:35:51.550487 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34d8e7eb-d10e-458f-ae05-e9c73c29b604-scripts" (OuterVolumeSpecName: "scripts") pod "34d8e7eb-d10e-458f-ae05-e9c73c29b604" (UID: "34d8e7eb-d10e-458f-ae05-e9c73c29b604"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:35:51 crc kubenswrapper[4735]: I1001 10:35:51.571700 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34d8e7eb-d10e-458f-ae05-e9c73c29b604-config-data" (OuterVolumeSpecName: "config-data") pod "34d8e7eb-d10e-458f-ae05-e9c73c29b604" (UID: "34d8e7eb-d10e-458f-ae05-e9c73c29b604"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:35:51 crc kubenswrapper[4735]: I1001 10:35:51.590623 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34d8e7eb-d10e-458f-ae05-e9c73c29b604-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34d8e7eb-d10e-458f-ae05-e9c73c29b604" (UID: "34d8e7eb-d10e-458f-ae05-e9c73c29b604"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:35:51 crc kubenswrapper[4735]: I1001 10:35:51.646032 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34d8e7eb-d10e-458f-ae05-e9c73c29b604-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:51 crc kubenswrapper[4735]: I1001 10:35:51.646073 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34d8e7eb-d10e-458f-ae05-e9c73c29b604-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:51 crc kubenswrapper[4735]: I1001 10:35:51.646088 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpl8d\" (UniqueName: \"kubernetes.io/projected/34d8e7eb-d10e-458f-ae05-e9c73c29b604-kube-api-access-hpl8d\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:51 crc kubenswrapper[4735]: I1001 10:35:51.646102 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34d8e7eb-d10e-458f-ae05-e9c73c29b604-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:52 crc kubenswrapper[4735]: I1001 10:35:52.075554 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2dqx7" event={"ID":"34d8e7eb-d10e-458f-ae05-e9c73c29b604","Type":"ContainerDied","Data":"331490ace23a0cab9b4059c7aff8dd99fd1dd96f7efa04208b7420d41a72fb5b"} Oct 01 10:35:52 crc kubenswrapper[4735]: I1001 10:35:52.075590 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="331490ace23a0cab9b4059c7aff8dd99fd1dd96f7efa04208b7420d41a72fb5b" Oct 01 10:35:52 crc kubenswrapper[4735]: I1001 10:35:52.075978 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2dqx7" Oct 01 10:35:52 crc kubenswrapper[4735]: I1001 10:35:52.156941 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 01 10:35:52 crc kubenswrapper[4735]: E1001 10:35:52.157826 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="309b0308-003c-4df0-880f-3b35d5607b1c" containerName="init" Oct 01 10:35:52 crc kubenswrapper[4735]: I1001 10:35:52.157859 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="309b0308-003c-4df0-880f-3b35d5607b1c" containerName="init" Oct 01 10:35:52 crc kubenswrapper[4735]: E1001 10:35:52.157932 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34d8e7eb-d10e-458f-ae05-e9c73c29b604" containerName="nova-cell1-conductor-db-sync" Oct 01 10:35:52 crc kubenswrapper[4735]: I1001 10:35:52.157946 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="34d8e7eb-d10e-458f-ae05-e9c73c29b604" containerName="nova-cell1-conductor-db-sync" Oct 01 10:35:52 crc kubenswrapper[4735]: E1001 10:35:52.157980 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="199d79d9-6c17-4d68-af5f-623ab4ceb059" containerName="nova-manage" Oct 01 10:35:52 crc kubenswrapper[4735]: I1001 10:35:52.157992 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="199d79d9-6c17-4d68-af5f-623ab4ceb059" containerName="nova-manage" Oct 01 10:35:52 crc kubenswrapper[4735]: E1001 10:35:52.158006 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="309b0308-003c-4df0-880f-3b35d5607b1c" containerName="dnsmasq-dns" Oct 01 10:35:52 crc kubenswrapper[4735]: I1001 10:35:52.158017 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="309b0308-003c-4df0-880f-3b35d5607b1c" containerName="dnsmasq-dns" Oct 01 10:35:52 crc kubenswrapper[4735]: I1001 10:35:52.158338 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="309b0308-003c-4df0-880f-3b35d5607b1c" 
containerName="dnsmasq-dns" Oct 01 10:35:52 crc kubenswrapper[4735]: I1001 10:35:52.158367 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="34d8e7eb-d10e-458f-ae05-e9c73c29b604" containerName="nova-cell1-conductor-db-sync" Oct 01 10:35:52 crc kubenswrapper[4735]: I1001 10:35:52.158404 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="199d79d9-6c17-4d68-af5f-623ab4ceb059" containerName="nova-manage" Oct 01 10:35:52 crc kubenswrapper[4735]: I1001 10:35:52.159376 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 01 10:35:52 crc kubenswrapper[4735]: I1001 10:35:52.161934 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 01 10:35:52 crc kubenswrapper[4735]: I1001 10:35:52.177623 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 01 10:35:52 crc kubenswrapper[4735]: I1001 10:35:52.259156 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf5a2979-222f-459d-9c57-599ebc27167e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"bf5a2979-222f-459d-9c57-599ebc27167e\") " pod="openstack/nova-cell1-conductor-0" Oct 01 10:35:52 crc kubenswrapper[4735]: I1001 10:35:52.259407 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf5a2979-222f-459d-9c57-599ebc27167e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"bf5a2979-222f-459d-9c57-599ebc27167e\") " pod="openstack/nova-cell1-conductor-0" Oct 01 10:35:52 crc kubenswrapper[4735]: I1001 10:35:52.259462 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlhrn\" (UniqueName: 
\"kubernetes.io/projected/bf5a2979-222f-459d-9c57-599ebc27167e-kube-api-access-nlhrn\") pod \"nova-cell1-conductor-0\" (UID: \"bf5a2979-222f-459d-9c57-599ebc27167e\") " pod="openstack/nova-cell1-conductor-0" Oct 01 10:35:52 crc kubenswrapper[4735]: I1001 10:35:52.360591 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf5a2979-222f-459d-9c57-599ebc27167e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"bf5a2979-222f-459d-9c57-599ebc27167e\") " pod="openstack/nova-cell1-conductor-0" Oct 01 10:35:52 crc kubenswrapper[4735]: I1001 10:35:52.360635 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlhrn\" (UniqueName: \"kubernetes.io/projected/bf5a2979-222f-459d-9c57-599ebc27167e-kube-api-access-nlhrn\") pod \"nova-cell1-conductor-0\" (UID: \"bf5a2979-222f-459d-9c57-599ebc27167e\") " pod="openstack/nova-cell1-conductor-0" Oct 01 10:35:52 crc kubenswrapper[4735]: I1001 10:35:52.360732 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf5a2979-222f-459d-9c57-599ebc27167e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"bf5a2979-222f-459d-9c57-599ebc27167e\") " pod="openstack/nova-cell1-conductor-0" Oct 01 10:35:52 crc kubenswrapper[4735]: I1001 10:35:52.366183 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf5a2979-222f-459d-9c57-599ebc27167e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"bf5a2979-222f-459d-9c57-599ebc27167e\") " pod="openstack/nova-cell1-conductor-0" Oct 01 10:35:52 crc kubenswrapper[4735]: I1001 10:35:52.366819 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf5a2979-222f-459d-9c57-599ebc27167e-config-data\") pod \"nova-cell1-conductor-0\" (UID: 
\"bf5a2979-222f-459d-9c57-599ebc27167e\") " pod="openstack/nova-cell1-conductor-0" Oct 01 10:35:52 crc kubenswrapper[4735]: I1001 10:35:52.379893 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlhrn\" (UniqueName: \"kubernetes.io/projected/bf5a2979-222f-459d-9c57-599ebc27167e-kube-api-access-nlhrn\") pod \"nova-cell1-conductor-0\" (UID: \"bf5a2979-222f-459d-9c57-599ebc27167e\") " pod="openstack/nova-cell1-conductor-0" Oct 01 10:35:52 crc kubenswrapper[4735]: I1001 10:35:52.480814 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 01 10:35:52 crc kubenswrapper[4735]: I1001 10:35:52.955736 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.078721 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.085544 4735 generic.go:334] "Generic (PLEG): container finished" podID="b9e6634b-3e22-4f35-a3e6-59053bb806fc" containerID="6eb919ed4af819d7a93df15fe34234f3dcbeaaafcca08647f348d6fff027f4b9" exitCode=0 Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.085630 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b9e6634b-3e22-4f35-a3e6-59053bb806fc","Type":"ContainerDied","Data":"6eb919ed4af819d7a93df15fe34234f3dcbeaaafcca08647f348d6fff027f4b9"} Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.090023 4735 generic.go:334] "Generic (PLEG): container finished" podID="1cd7d42f-bcb3-49a7-aef2-1372a983e375" containerID="c9da29d655aa49f32d50d08b55364eeeea48fa2e8d7fc228f0b16ecce33b4320" exitCode=0 Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.090150 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"1cd7d42f-bcb3-49a7-aef2-1372a983e375","Type":"ContainerDied","Data":"c9da29d655aa49f32d50d08b55364eeeea48fa2e8d7fc228f0b16ecce33b4320"} Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.090188 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.090201 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1cd7d42f-bcb3-49a7-aef2-1372a983e375","Type":"ContainerDied","Data":"93fe1e8ef8d8ed3927b41fe955a5a4af12d52a6570c8e6665ef2901b39fc4389"} Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.090217 4735 scope.go:117] "RemoveContainer" containerID="c9da29d655aa49f32d50d08b55364eeeea48fa2e8d7fc228f0b16ecce33b4320" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.095243 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"bf5a2979-222f-459d-9c57-599ebc27167e","Type":"ContainerStarted","Data":"e52e428dc73cfbbe8ec7383a06fa5258cbb80a44962cb3d1de16437ec569e5ca"} Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.121784 4735 scope.go:117] "RemoveContainer" containerID="c9da29d655aa49f32d50d08b55364eeeea48fa2e8d7fc228f0b16ecce33b4320" Oct 01 10:35:53 crc kubenswrapper[4735]: E1001 10:35:53.122325 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9da29d655aa49f32d50d08b55364eeeea48fa2e8d7fc228f0b16ecce33b4320\": container with ID starting with c9da29d655aa49f32d50d08b55364eeeea48fa2e8d7fc228f0b16ecce33b4320 not found: ID does not exist" containerID="c9da29d655aa49f32d50d08b55364eeeea48fa2e8d7fc228f0b16ecce33b4320" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.122372 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9da29d655aa49f32d50d08b55364eeeea48fa2e8d7fc228f0b16ecce33b4320"} 
err="failed to get container status \"c9da29d655aa49f32d50d08b55364eeeea48fa2e8d7fc228f0b16ecce33b4320\": rpc error: code = NotFound desc = could not find container \"c9da29d655aa49f32d50d08b55364eeeea48fa2e8d7fc228f0b16ecce33b4320\": container with ID starting with c9da29d655aa49f32d50d08b55364eeeea48fa2e8d7fc228f0b16ecce33b4320 not found: ID does not exist" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.155505 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.191478 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cd7d42f-bcb3-49a7-aef2-1372a983e375-config-data\") pod \"1cd7d42f-bcb3-49a7-aef2-1372a983e375\" (UID: \"1cd7d42f-bcb3-49a7-aef2-1372a983e375\") " Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.191605 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cd7d42f-bcb3-49a7-aef2-1372a983e375-combined-ca-bundle\") pod \"1cd7d42f-bcb3-49a7-aef2-1372a983e375\" (UID: \"1cd7d42f-bcb3-49a7-aef2-1372a983e375\") " Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.191629 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnxsr\" (UniqueName: \"kubernetes.io/projected/1cd7d42f-bcb3-49a7-aef2-1372a983e375-kube-api-access-vnxsr\") pod \"1cd7d42f-bcb3-49a7-aef2-1372a983e375\" (UID: \"1cd7d42f-bcb3-49a7-aef2-1372a983e375\") " Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.195814 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cd7d42f-bcb3-49a7-aef2-1372a983e375-kube-api-access-vnxsr" (OuterVolumeSpecName: "kube-api-access-vnxsr") pod "1cd7d42f-bcb3-49a7-aef2-1372a983e375" (UID: "1cd7d42f-bcb3-49a7-aef2-1372a983e375"). 
InnerVolumeSpecName "kube-api-access-vnxsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.216061 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cd7d42f-bcb3-49a7-aef2-1372a983e375-config-data" (OuterVolumeSpecName: "config-data") pod "1cd7d42f-bcb3-49a7-aef2-1372a983e375" (UID: "1cd7d42f-bcb3-49a7-aef2-1372a983e375"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.217485 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cd7d42f-bcb3-49a7-aef2-1372a983e375-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1cd7d42f-bcb3-49a7-aef2-1372a983e375" (UID: "1cd7d42f-bcb3-49a7-aef2-1372a983e375"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.292947 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9e6634b-3e22-4f35-a3e6-59053bb806fc-combined-ca-bundle\") pod \"b9e6634b-3e22-4f35-a3e6-59053bb806fc\" (UID: \"b9e6634b-3e22-4f35-a3e6-59053bb806fc\") " Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.293193 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9e6634b-3e22-4f35-a3e6-59053bb806fc-logs\") pod \"b9e6634b-3e22-4f35-a3e6-59053bb806fc\" (UID: \"b9e6634b-3e22-4f35-a3e6-59053bb806fc\") " Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.293241 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxh9z\" (UniqueName: \"kubernetes.io/projected/b9e6634b-3e22-4f35-a3e6-59053bb806fc-kube-api-access-zxh9z\") pod \"b9e6634b-3e22-4f35-a3e6-59053bb806fc\" (UID: 
\"b9e6634b-3e22-4f35-a3e6-59053bb806fc\") " Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.293332 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9e6634b-3e22-4f35-a3e6-59053bb806fc-config-data\") pod \"b9e6634b-3e22-4f35-a3e6-59053bb806fc\" (UID: \"b9e6634b-3e22-4f35-a3e6-59053bb806fc\") " Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.293706 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9e6634b-3e22-4f35-a3e6-59053bb806fc-logs" (OuterVolumeSpecName: "logs") pod "b9e6634b-3e22-4f35-a3e6-59053bb806fc" (UID: "b9e6634b-3e22-4f35-a3e6-59053bb806fc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.294084 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9e6634b-3e22-4f35-a3e6-59053bb806fc-logs\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.294110 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cd7d42f-bcb3-49a7-aef2-1372a983e375-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.294122 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cd7d42f-bcb3-49a7-aef2-1372a983e375-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.294178 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnxsr\" (UniqueName: \"kubernetes.io/projected/1cd7d42f-bcb3-49a7-aef2-1372a983e375-kube-api-access-vnxsr\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.296373 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/b9e6634b-3e22-4f35-a3e6-59053bb806fc-kube-api-access-zxh9z" (OuterVolumeSpecName: "kube-api-access-zxh9z") pod "b9e6634b-3e22-4f35-a3e6-59053bb806fc" (UID: "b9e6634b-3e22-4f35-a3e6-59053bb806fc"). InnerVolumeSpecName "kube-api-access-zxh9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.315675 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9e6634b-3e22-4f35-a3e6-59053bb806fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9e6634b-3e22-4f35-a3e6-59053bb806fc" (UID: "b9e6634b-3e22-4f35-a3e6-59053bb806fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.316635 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9e6634b-3e22-4f35-a3e6-59053bb806fc-config-data" (OuterVolumeSpecName: "config-data") pod "b9e6634b-3e22-4f35-a3e6-59053bb806fc" (UID: "b9e6634b-3e22-4f35-a3e6-59053bb806fc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.396896 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9e6634b-3e22-4f35-a3e6-59053bb806fc-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.397221 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9e6634b-3e22-4f35-a3e6-59053bb806fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.397320 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxh9z\" (UniqueName: \"kubernetes.io/projected/b9e6634b-3e22-4f35-a3e6-59053bb806fc-kube-api-access-zxh9z\") on node \"crc\" DevicePath \"\"" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.469193 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.480991 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.490872 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 10:35:53 crc kubenswrapper[4735]: E1001 10:35:53.491324 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9e6634b-3e22-4f35-a3e6-59053bb806fc" containerName="nova-api-log" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.491346 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9e6634b-3e22-4f35-a3e6-59053bb806fc" containerName="nova-api-log" Oct 01 10:35:53 crc kubenswrapper[4735]: E1001 10:35:53.491363 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9e6634b-3e22-4f35-a3e6-59053bb806fc" containerName="nova-api-api" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.491372 4735 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="b9e6634b-3e22-4f35-a3e6-59053bb806fc" containerName="nova-api-api" Oct 01 10:35:53 crc kubenswrapper[4735]: E1001 10:35:53.491392 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cd7d42f-bcb3-49a7-aef2-1372a983e375" containerName="nova-scheduler-scheduler" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.491402 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cd7d42f-bcb3-49a7-aef2-1372a983e375" containerName="nova-scheduler-scheduler" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.491692 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9e6634b-3e22-4f35-a3e6-59053bb806fc" containerName="nova-api-api" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.491722 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cd7d42f-bcb3-49a7-aef2-1372a983e375" containerName="nova-scheduler-scheduler" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.491750 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9e6634b-3e22-4f35-a3e6-59053bb806fc" containerName="nova-api-log" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.492470 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.495187 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.500710 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.601007 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/909500e0-2864-46b8-9b4f-a234e80419f7-config-data\") pod \"nova-scheduler-0\" (UID: \"909500e0-2864-46b8-9b4f-a234e80419f7\") " pod="openstack/nova-scheduler-0" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.601142 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m54mc\" (UniqueName: \"kubernetes.io/projected/909500e0-2864-46b8-9b4f-a234e80419f7-kube-api-access-m54mc\") pod \"nova-scheduler-0\" (UID: \"909500e0-2864-46b8-9b4f-a234e80419f7\") " pod="openstack/nova-scheduler-0" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.601171 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/909500e0-2864-46b8-9b4f-a234e80419f7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"909500e0-2864-46b8-9b4f-a234e80419f7\") " pod="openstack/nova-scheduler-0" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.703190 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m54mc\" (UniqueName: \"kubernetes.io/projected/909500e0-2864-46b8-9b4f-a234e80419f7-kube-api-access-m54mc\") pod \"nova-scheduler-0\" (UID: \"909500e0-2864-46b8-9b4f-a234e80419f7\") " pod="openstack/nova-scheduler-0" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.703404 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/909500e0-2864-46b8-9b4f-a234e80419f7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"909500e0-2864-46b8-9b4f-a234e80419f7\") " pod="openstack/nova-scheduler-0" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.703542 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/909500e0-2864-46b8-9b4f-a234e80419f7-config-data\") pod \"nova-scheduler-0\" (UID: \"909500e0-2864-46b8-9b4f-a234e80419f7\") " pod="openstack/nova-scheduler-0" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.707567 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/909500e0-2864-46b8-9b4f-a234e80419f7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"909500e0-2864-46b8-9b4f-a234e80419f7\") " pod="openstack/nova-scheduler-0" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.710218 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/909500e0-2864-46b8-9b4f-a234e80419f7-config-data\") pod \"nova-scheduler-0\" (UID: \"909500e0-2864-46b8-9b4f-a234e80419f7\") " pod="openstack/nova-scheduler-0" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.720614 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m54mc\" (UniqueName: \"kubernetes.io/projected/909500e0-2864-46b8-9b4f-a234e80419f7-kube-api-access-m54mc\") pod \"nova-scheduler-0\" (UID: \"909500e0-2864-46b8-9b4f-a234e80419f7\") " pod="openstack/nova-scheduler-0" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.809166 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 10:35:53 crc kubenswrapper[4735]: I1001 10:35:53.912479 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cd7d42f-bcb3-49a7-aef2-1372a983e375" path="/var/lib/kubelet/pods/1cd7d42f-bcb3-49a7-aef2-1372a983e375/volumes" Oct 01 10:35:54 crc kubenswrapper[4735]: I1001 10:35:54.118027 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"bf5a2979-222f-459d-9c57-599ebc27167e","Type":"ContainerStarted","Data":"5cfd4e33e0cd8cc3fd097754074135e8fca44ca2c92467cbdabc869f15dbc3a3"} Oct 01 10:35:54 crc kubenswrapper[4735]: I1001 10:35:54.118219 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 01 10:35:54 crc kubenswrapper[4735]: I1001 10:35:54.120881 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b9e6634b-3e22-4f35-a3e6-59053bb806fc","Type":"ContainerDied","Data":"40fa10a79749142e21e12e6262ec8629c59fdd4b4d2ac86ea6f7f30ae7fcec60"} Oct 01 10:35:54 crc kubenswrapper[4735]: I1001 10:35:54.120905 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 01 10:35:54 crc kubenswrapper[4735]: I1001 10:35:54.120919 4735 scope.go:117] "RemoveContainer" containerID="6eb919ed4af819d7a93df15fe34234f3dcbeaaafcca08647f348d6fff027f4b9" Oct 01 10:35:54 crc kubenswrapper[4735]: I1001 10:35:54.143457 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.1434407 podStartE2EDuration="2.1434407s" podCreationTimestamp="2025-10-01 10:35:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:35:54.137902471 +0000 UTC m=+1112.830723733" watchObservedRunningTime="2025-10-01 10:35:54.1434407 +0000 UTC m=+1112.836261962" Oct 01 10:35:54 crc kubenswrapper[4735]: I1001 10:35:54.147144 4735 scope.go:117] "RemoveContainer" containerID="78a38140c4d9ed2d0869d7d87e4ba637ec8868920d977b0c525bdfe590af68ad" Oct 01 10:35:54 crc kubenswrapper[4735]: I1001 10:35:54.161247 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 10:35:54 crc kubenswrapper[4735]: I1001 10:35:54.167333 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 01 10:35:54 crc kubenswrapper[4735]: I1001 10:35:54.195440 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 01 10:35:54 crc kubenswrapper[4735]: I1001 10:35:54.197036 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 01 10:35:54 crc kubenswrapper[4735]: I1001 10:35:54.199802 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 01 10:35:54 crc kubenswrapper[4735]: I1001 10:35:54.225029 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 10:35:54 crc kubenswrapper[4735]: I1001 10:35:54.286844 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 10:35:54 crc kubenswrapper[4735]: I1001 10:35:54.318945 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8273b492-e0a2-49e5-a162-5071ab3506a3-logs\") pod \"nova-api-0\" (UID: \"8273b492-e0a2-49e5-a162-5071ab3506a3\") " pod="openstack/nova-api-0" Oct 01 10:35:54 crc kubenswrapper[4735]: I1001 10:35:54.319067 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8273b492-e0a2-49e5-a162-5071ab3506a3-config-data\") pod \"nova-api-0\" (UID: \"8273b492-e0a2-49e5-a162-5071ab3506a3\") " pod="openstack/nova-api-0" Oct 01 10:35:54 crc kubenswrapper[4735]: I1001 10:35:54.319185 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8273b492-e0a2-49e5-a162-5071ab3506a3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8273b492-e0a2-49e5-a162-5071ab3506a3\") " pod="openstack/nova-api-0" Oct 01 10:35:54 crc kubenswrapper[4735]: I1001 10:35:54.319412 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt5ql\" (UniqueName: \"kubernetes.io/projected/8273b492-e0a2-49e5-a162-5071ab3506a3-kube-api-access-xt5ql\") pod \"nova-api-0\" (UID: \"8273b492-e0a2-49e5-a162-5071ab3506a3\") " pod="openstack/nova-api-0" Oct 01 
10:35:54 crc kubenswrapper[4735]: I1001 10:35:54.422056 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8273b492-e0a2-49e5-a162-5071ab3506a3-logs\") pod \"nova-api-0\" (UID: \"8273b492-e0a2-49e5-a162-5071ab3506a3\") " pod="openstack/nova-api-0" Oct 01 10:35:54 crc kubenswrapper[4735]: I1001 10:35:54.422169 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8273b492-e0a2-49e5-a162-5071ab3506a3-config-data\") pod \"nova-api-0\" (UID: \"8273b492-e0a2-49e5-a162-5071ab3506a3\") " pod="openstack/nova-api-0" Oct 01 10:35:54 crc kubenswrapper[4735]: I1001 10:35:54.422214 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8273b492-e0a2-49e5-a162-5071ab3506a3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8273b492-e0a2-49e5-a162-5071ab3506a3\") " pod="openstack/nova-api-0" Oct 01 10:35:54 crc kubenswrapper[4735]: I1001 10:35:54.422319 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt5ql\" (UniqueName: \"kubernetes.io/projected/8273b492-e0a2-49e5-a162-5071ab3506a3-kube-api-access-xt5ql\") pod \"nova-api-0\" (UID: \"8273b492-e0a2-49e5-a162-5071ab3506a3\") " pod="openstack/nova-api-0" Oct 01 10:35:54 crc kubenswrapper[4735]: I1001 10:35:54.422454 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8273b492-e0a2-49e5-a162-5071ab3506a3-logs\") pod \"nova-api-0\" (UID: \"8273b492-e0a2-49e5-a162-5071ab3506a3\") " pod="openstack/nova-api-0" Oct 01 10:35:54 crc kubenswrapper[4735]: I1001 10:35:54.426203 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8273b492-e0a2-49e5-a162-5071ab3506a3-config-data\") pod \"nova-api-0\" (UID: 
\"8273b492-e0a2-49e5-a162-5071ab3506a3\") " pod="openstack/nova-api-0" Oct 01 10:35:54 crc kubenswrapper[4735]: I1001 10:35:54.427101 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8273b492-e0a2-49e5-a162-5071ab3506a3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8273b492-e0a2-49e5-a162-5071ab3506a3\") " pod="openstack/nova-api-0" Oct 01 10:35:54 crc kubenswrapper[4735]: I1001 10:35:54.440319 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt5ql\" (UniqueName: \"kubernetes.io/projected/8273b492-e0a2-49e5-a162-5071ab3506a3-kube-api-access-xt5ql\") pod \"nova-api-0\" (UID: \"8273b492-e0a2-49e5-a162-5071ab3506a3\") " pod="openstack/nova-api-0" Oct 01 10:35:54 crc kubenswrapper[4735]: I1001 10:35:54.522476 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 10:35:54 crc kubenswrapper[4735]: I1001 10:35:54.967182 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 10:35:55 crc kubenswrapper[4735]: I1001 10:35:55.138008 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"909500e0-2864-46b8-9b4f-a234e80419f7","Type":"ContainerStarted","Data":"9e8ffc005f7aff3ca002a6906c52fa45daab77abd816f8b76fc204f8b960fa9a"} Oct 01 10:35:55 crc kubenswrapper[4735]: I1001 10:35:55.138439 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"909500e0-2864-46b8-9b4f-a234e80419f7","Type":"ContainerStarted","Data":"da42ad3161f5990a55504c00a1b68b579533b99c4fce05926188821a2ecee407"} Oct 01 10:35:55 crc kubenswrapper[4735]: I1001 10:35:55.141271 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8273b492-e0a2-49e5-a162-5071ab3506a3","Type":"ContainerStarted","Data":"b809460aa8c72d3bca246d889537a1c6f85a043454a768d12dfa6db10fdb84a3"} Oct 
01 10:35:55 crc kubenswrapper[4735]: I1001 10:35:55.912017 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9e6634b-3e22-4f35-a3e6-59053bb806fc" path="/var/lib/kubelet/pods/b9e6634b-3e22-4f35-a3e6-59053bb806fc/volumes" Oct 01 10:35:56 crc kubenswrapper[4735]: I1001 10:35:56.152321 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8273b492-e0a2-49e5-a162-5071ab3506a3","Type":"ContainerStarted","Data":"ad274fe08c5c87c032ee03c238f39a4e0565da78a9a515dd3b10741adbc03d2c"} Oct 01 10:35:56 crc kubenswrapper[4735]: I1001 10:35:56.152636 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8273b492-e0a2-49e5-a162-5071ab3506a3","Type":"ContainerStarted","Data":"dfe3b96d98ee0727e913076a837c8baa075dd387f335b541e279ad8cef5f4883"} Oct 01 10:35:56 crc kubenswrapper[4735]: I1001 10:35:56.171865 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.171848271 podStartE2EDuration="3.171848271s" podCreationTimestamp="2025-10-01 10:35:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:35:55.159202637 +0000 UTC m=+1113.852023919" watchObservedRunningTime="2025-10-01 10:35:56.171848271 +0000 UTC m=+1114.864669533" Oct 01 10:35:56 crc kubenswrapper[4735]: I1001 10:35:56.173141 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.173133526 podStartE2EDuration="2.173133526s" podCreationTimestamp="2025-10-01 10:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:35:56.165235434 +0000 UTC m=+1114.858056696" watchObservedRunningTime="2025-10-01 10:35:56.173133526 +0000 UTC m=+1114.865954788" Oct 01 10:35:58 crc kubenswrapper[4735]: I1001 
10:35:58.810287 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 01 10:36:02 crc kubenswrapper[4735]: I1001 10:36:02.511106 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 01 10:36:03 crc kubenswrapper[4735]: I1001 10:36:03.809531 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 01 10:36:03 crc kubenswrapper[4735]: I1001 10:36:03.861702 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 01 10:36:04 crc kubenswrapper[4735]: I1001 10:36:04.267691 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 01 10:36:04 crc kubenswrapper[4735]: I1001 10:36:04.522948 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 10:36:04 crc kubenswrapper[4735]: I1001 10:36:04.523278 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 10:36:05 crc kubenswrapper[4735]: I1001 10:36:05.605712 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8273b492-e0a2-49e5-a162-5071ab3506a3" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.197:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 10:36:05 crc kubenswrapper[4735]: I1001 10:36:05.605722 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8273b492-e0a2-49e5-a162-5071ab3506a3" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.197:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 10:36:06 crc kubenswrapper[4735]: I1001 10:36:06.382545 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ceilometer-0" Oct 01 10:36:11 crc kubenswrapper[4735]: I1001 10:36:11.310406 4735 generic.go:334] "Generic (PLEG): container finished" podID="66442e52-6cd9-44d4-a77f-9d67f83d4d94" containerID="3ec3ef57eeb0651548c3ecea3aba543d978deb1903c04e2fc0a0fabaf0345c74" exitCode=137 Oct 01 10:36:11 crc kubenswrapper[4735]: I1001 10:36:11.310836 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"66442e52-6cd9-44d4-a77f-9d67f83d4d94","Type":"ContainerDied","Data":"3ec3ef57eeb0651548c3ecea3aba543d978deb1903c04e2fc0a0fabaf0345c74"} Oct 01 10:36:11 crc kubenswrapper[4735]: I1001 10:36:11.310875 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"66442e52-6cd9-44d4-a77f-9d67f83d4d94","Type":"ContainerDied","Data":"b72e43986118fc2acd392046153af60addab06008ecf87128a1d1f19174fc565"} Oct 01 10:36:11 crc kubenswrapper[4735]: I1001 10:36:11.310894 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b72e43986118fc2acd392046153af60addab06008ecf87128a1d1f19174fc565" Oct 01 10:36:11 crc kubenswrapper[4735]: I1001 10:36:11.313937 4735 generic.go:334] "Generic (PLEG): container finished" podID="be2afa25-77bd-421d-b0c6-67a3d31b642f" containerID="b39cca5427a74f9f23f708f6681caadec1c5360fb3d36e84e9dffa33ca53e252" exitCode=137 Oct 01 10:36:11 crc kubenswrapper[4735]: I1001 10:36:11.313964 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"be2afa25-77bd-421d-b0c6-67a3d31b642f","Type":"ContainerDied","Data":"b39cca5427a74f9f23f708f6681caadec1c5360fb3d36e84e9dffa33ca53e252"} Oct 01 10:36:11 crc kubenswrapper[4735]: I1001 10:36:11.385354 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 10:36:11 crc kubenswrapper[4735]: I1001 10:36:11.391866 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 10:36:11 crc kubenswrapper[4735]: I1001 10:36:11.450123 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvn6c\" (UniqueName: \"kubernetes.io/projected/66442e52-6cd9-44d4-a77f-9d67f83d4d94-kube-api-access-kvn6c\") pod \"66442e52-6cd9-44d4-a77f-9d67f83d4d94\" (UID: \"66442e52-6cd9-44d4-a77f-9d67f83d4d94\") " Oct 01 10:36:11 crc kubenswrapper[4735]: I1001 10:36:11.450536 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66442e52-6cd9-44d4-a77f-9d67f83d4d94-combined-ca-bundle\") pod \"66442e52-6cd9-44d4-a77f-9d67f83d4d94\" (UID: \"66442e52-6cd9-44d4-a77f-9d67f83d4d94\") " Oct 01 10:36:11 crc kubenswrapper[4735]: I1001 10:36:11.450602 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdjjr\" (UniqueName: \"kubernetes.io/projected/be2afa25-77bd-421d-b0c6-67a3d31b642f-kube-api-access-hdjjr\") pod \"be2afa25-77bd-421d-b0c6-67a3d31b642f\" (UID: \"be2afa25-77bd-421d-b0c6-67a3d31b642f\") " Oct 01 10:36:11 crc kubenswrapper[4735]: I1001 10:36:11.450653 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66442e52-6cd9-44d4-a77f-9d67f83d4d94-logs\") pod \"66442e52-6cd9-44d4-a77f-9d67f83d4d94\" (UID: \"66442e52-6cd9-44d4-a77f-9d67f83d4d94\") " Oct 01 10:36:11 crc kubenswrapper[4735]: I1001 10:36:11.450676 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be2afa25-77bd-421d-b0c6-67a3d31b642f-config-data\") pod \"be2afa25-77bd-421d-b0c6-67a3d31b642f\" (UID: \"be2afa25-77bd-421d-b0c6-67a3d31b642f\") " Oct 01 10:36:11 crc kubenswrapper[4735]: I1001 10:36:11.450714 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/66442e52-6cd9-44d4-a77f-9d67f83d4d94-config-data\") pod \"66442e52-6cd9-44d4-a77f-9d67f83d4d94\" (UID: \"66442e52-6cd9-44d4-a77f-9d67f83d4d94\") " Oct 01 10:36:11 crc kubenswrapper[4735]: I1001 10:36:11.451026 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66442e52-6cd9-44d4-a77f-9d67f83d4d94-logs" (OuterVolumeSpecName: "logs") pod "66442e52-6cd9-44d4-a77f-9d67f83d4d94" (UID: "66442e52-6cd9-44d4-a77f-9d67f83d4d94"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:36:11 crc kubenswrapper[4735]: I1001 10:36:11.451240 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be2afa25-77bd-421d-b0c6-67a3d31b642f-combined-ca-bundle\") pod \"be2afa25-77bd-421d-b0c6-67a3d31b642f\" (UID: \"be2afa25-77bd-421d-b0c6-67a3d31b642f\") " Oct 01 10:36:11 crc kubenswrapper[4735]: I1001 10:36:11.452967 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66442e52-6cd9-44d4-a77f-9d67f83d4d94-logs\") on node \"crc\" DevicePath \"\"" Oct 01 10:36:11 crc kubenswrapper[4735]: I1001 10:36:11.456149 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66442e52-6cd9-44d4-a77f-9d67f83d4d94-kube-api-access-kvn6c" (OuterVolumeSpecName: "kube-api-access-kvn6c") pod "66442e52-6cd9-44d4-a77f-9d67f83d4d94" (UID: "66442e52-6cd9-44d4-a77f-9d67f83d4d94"). InnerVolumeSpecName "kube-api-access-kvn6c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:36:11 crc kubenswrapper[4735]: I1001 10:36:11.457059 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be2afa25-77bd-421d-b0c6-67a3d31b642f-kube-api-access-hdjjr" (OuterVolumeSpecName: "kube-api-access-hdjjr") pod "be2afa25-77bd-421d-b0c6-67a3d31b642f" (UID: "be2afa25-77bd-421d-b0c6-67a3d31b642f"). InnerVolumeSpecName "kube-api-access-hdjjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:36:11 crc kubenswrapper[4735]: I1001 10:36:11.481750 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66442e52-6cd9-44d4-a77f-9d67f83d4d94-config-data" (OuterVolumeSpecName: "config-data") pod "66442e52-6cd9-44d4-a77f-9d67f83d4d94" (UID: "66442e52-6cd9-44d4-a77f-9d67f83d4d94"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:36:11 crc kubenswrapper[4735]: I1001 10:36:11.484106 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66442e52-6cd9-44d4-a77f-9d67f83d4d94-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66442e52-6cd9-44d4-a77f-9d67f83d4d94" (UID: "66442e52-6cd9-44d4-a77f-9d67f83d4d94"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:36:11 crc kubenswrapper[4735]: I1001 10:36:11.486507 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be2afa25-77bd-421d-b0c6-67a3d31b642f-config-data" (OuterVolumeSpecName: "config-data") pod "be2afa25-77bd-421d-b0c6-67a3d31b642f" (UID: "be2afa25-77bd-421d-b0c6-67a3d31b642f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:36:11 crc kubenswrapper[4735]: I1001 10:36:11.486959 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be2afa25-77bd-421d-b0c6-67a3d31b642f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be2afa25-77bd-421d-b0c6-67a3d31b642f" (UID: "be2afa25-77bd-421d-b0c6-67a3d31b642f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:36:11 crc kubenswrapper[4735]: I1001 10:36:11.555042 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdjjr\" (UniqueName: \"kubernetes.io/projected/be2afa25-77bd-421d-b0c6-67a3d31b642f-kube-api-access-hdjjr\") on node \"crc\" DevicePath \"\"" Oct 01 10:36:11 crc kubenswrapper[4735]: I1001 10:36:11.555086 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be2afa25-77bd-421d-b0c6-67a3d31b642f-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 10:36:11 crc kubenswrapper[4735]: I1001 10:36:11.555103 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66442e52-6cd9-44d4-a77f-9d67f83d4d94-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 10:36:11 crc kubenswrapper[4735]: I1001 10:36:11.555119 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be2afa25-77bd-421d-b0c6-67a3d31b642f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:36:11 crc kubenswrapper[4735]: I1001 10:36:11.555132 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvn6c\" (UniqueName: \"kubernetes.io/projected/66442e52-6cd9-44d4-a77f-9d67f83d4d94-kube-api-access-kvn6c\") on node \"crc\" DevicePath \"\"" Oct 01 10:36:11 crc kubenswrapper[4735]: I1001 10:36:11.555144 4735 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66442e52-6cd9-44d4-a77f-9d67f83d4d94-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.334488 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.334650 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.334764 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"be2afa25-77bd-421d-b0c6-67a3d31b642f","Type":"ContainerDied","Data":"f6e55e93e5b330776f9b3f61a4efd9d8f37592f2acb280ed18eafb2417937d1c"} Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.334816 4735 scope.go:117] "RemoveContainer" containerID="b39cca5427a74f9f23f708f6681caadec1c5360fb3d36e84e9dffa33ca53e252" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.408197 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.428523 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.448352 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.458215 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.467858 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 01 10:36:12 crc kubenswrapper[4735]: E1001 10:36:12.468459 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be2afa25-77bd-421d-b0c6-67a3d31b642f" containerName="nova-cell1-novncproxy-novncproxy" Oct 01 
10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.468485 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="be2afa25-77bd-421d-b0c6-67a3d31b642f" containerName="nova-cell1-novncproxy-novncproxy" Oct 01 10:36:12 crc kubenswrapper[4735]: E1001 10:36:12.468522 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66442e52-6cd9-44d4-a77f-9d67f83d4d94" containerName="nova-metadata-metadata" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.468532 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="66442e52-6cd9-44d4-a77f-9d67f83d4d94" containerName="nova-metadata-metadata" Oct 01 10:36:12 crc kubenswrapper[4735]: E1001 10:36:12.468557 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66442e52-6cd9-44d4-a77f-9d67f83d4d94" containerName="nova-metadata-log" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.468566 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="66442e52-6cd9-44d4-a77f-9d67f83d4d94" containerName="nova-metadata-log" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.468810 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="66442e52-6cd9-44d4-a77f-9d67f83d4d94" containerName="nova-metadata-metadata" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.468855 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="be2afa25-77bd-421d-b0c6-67a3d31b642f" containerName="nova-cell1-novncproxy-novncproxy" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.468871 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="66442e52-6cd9-44d4-a77f-9d67f83d4d94" containerName="nova-metadata-log" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.470331 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.472442 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.472605 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.485877 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.487462 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.489041 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.489578 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.490757 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.499597 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.513258 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.575147 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8639e0ae-f968-4b8f-b73d-52c2aba0ad24-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8639e0ae-f968-4b8f-b73d-52c2aba0ad24\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 
10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.575196 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgqsg\" (UniqueName: \"kubernetes.io/projected/54e82a2b-aaf1-42bc-b424-d570d07b6830-kube-api-access-fgqsg\") pod \"nova-metadata-0\" (UID: \"54e82a2b-aaf1-42bc-b424-d570d07b6830\") " pod="openstack/nova-metadata-0" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.575221 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8639e0ae-f968-4b8f-b73d-52c2aba0ad24-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8639e0ae-f968-4b8f-b73d-52c2aba0ad24\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.575274 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8639e0ae-f968-4b8f-b73d-52c2aba0ad24-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8639e0ae-f968-4b8f-b73d-52c2aba0ad24\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.575414 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54e82a2b-aaf1-42bc-b424-d570d07b6830-logs\") pod \"nova-metadata-0\" (UID: \"54e82a2b-aaf1-42bc-b424-d570d07b6830\") " pod="openstack/nova-metadata-0" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.575555 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e82a2b-aaf1-42bc-b424-d570d07b6830-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"54e82a2b-aaf1-42bc-b424-d570d07b6830\") " pod="openstack/nova-metadata-0" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 
10:36:12.575592 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/54e82a2b-aaf1-42bc-b424-d570d07b6830-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"54e82a2b-aaf1-42bc-b424-d570d07b6830\") " pod="openstack/nova-metadata-0" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.575626 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls74z\" (UniqueName: \"kubernetes.io/projected/8639e0ae-f968-4b8f-b73d-52c2aba0ad24-kube-api-access-ls74z\") pod \"nova-cell1-novncproxy-0\" (UID: \"8639e0ae-f968-4b8f-b73d-52c2aba0ad24\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.575716 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54e82a2b-aaf1-42bc-b424-d570d07b6830-config-data\") pod \"nova-metadata-0\" (UID: \"54e82a2b-aaf1-42bc-b424-d570d07b6830\") " pod="openstack/nova-metadata-0" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.575873 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8639e0ae-f968-4b8f-b73d-52c2aba0ad24-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8639e0ae-f968-4b8f-b73d-52c2aba0ad24\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.678418 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54e82a2b-aaf1-42bc-b424-d570d07b6830-config-data\") pod \"nova-metadata-0\" (UID: \"54e82a2b-aaf1-42bc-b424-d570d07b6830\") " pod="openstack/nova-metadata-0" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.678576 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8639e0ae-f968-4b8f-b73d-52c2aba0ad24-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8639e0ae-f968-4b8f-b73d-52c2aba0ad24\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.678624 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8639e0ae-f968-4b8f-b73d-52c2aba0ad24-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8639e0ae-f968-4b8f-b73d-52c2aba0ad24\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.678654 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgqsg\" (UniqueName: \"kubernetes.io/projected/54e82a2b-aaf1-42bc-b424-d570d07b6830-kube-api-access-fgqsg\") pod \"nova-metadata-0\" (UID: \"54e82a2b-aaf1-42bc-b424-d570d07b6830\") " pod="openstack/nova-metadata-0" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.678686 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8639e0ae-f968-4b8f-b73d-52c2aba0ad24-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8639e0ae-f968-4b8f-b73d-52c2aba0ad24\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.678731 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8639e0ae-f968-4b8f-b73d-52c2aba0ad24-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8639e0ae-f968-4b8f-b73d-52c2aba0ad24\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.678776 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/54e82a2b-aaf1-42bc-b424-d570d07b6830-logs\") pod \"nova-metadata-0\" (UID: \"54e82a2b-aaf1-42bc-b424-d570d07b6830\") " pod="openstack/nova-metadata-0" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.678840 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e82a2b-aaf1-42bc-b424-d570d07b6830-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"54e82a2b-aaf1-42bc-b424-d570d07b6830\") " pod="openstack/nova-metadata-0" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.678863 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/54e82a2b-aaf1-42bc-b424-d570d07b6830-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"54e82a2b-aaf1-42bc-b424-d570d07b6830\") " pod="openstack/nova-metadata-0" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.678897 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls74z\" (UniqueName: \"kubernetes.io/projected/8639e0ae-f968-4b8f-b73d-52c2aba0ad24-kube-api-access-ls74z\") pod \"nova-cell1-novncproxy-0\" (UID: \"8639e0ae-f968-4b8f-b73d-52c2aba0ad24\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.679424 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54e82a2b-aaf1-42bc-b424-d570d07b6830-logs\") pod \"nova-metadata-0\" (UID: \"54e82a2b-aaf1-42bc-b424-d570d07b6830\") " pod="openstack/nova-metadata-0" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.685380 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e82a2b-aaf1-42bc-b424-d570d07b6830-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"54e82a2b-aaf1-42bc-b424-d570d07b6830\") " 
pod="openstack/nova-metadata-0" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.686462 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8639e0ae-f968-4b8f-b73d-52c2aba0ad24-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8639e0ae-f968-4b8f-b73d-52c2aba0ad24\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.686470 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8639e0ae-f968-4b8f-b73d-52c2aba0ad24-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8639e0ae-f968-4b8f-b73d-52c2aba0ad24\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.686994 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/54e82a2b-aaf1-42bc-b424-d570d07b6830-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"54e82a2b-aaf1-42bc-b424-d570d07b6830\") " pod="openstack/nova-metadata-0" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.687257 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8639e0ae-f968-4b8f-b73d-52c2aba0ad24-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8639e0ae-f968-4b8f-b73d-52c2aba0ad24\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.687857 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54e82a2b-aaf1-42bc-b424-d570d07b6830-config-data\") pod \"nova-metadata-0\" (UID: \"54e82a2b-aaf1-42bc-b424-d570d07b6830\") " pod="openstack/nova-metadata-0" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.688754 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8639e0ae-f968-4b8f-b73d-52c2aba0ad24-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8639e0ae-f968-4b8f-b73d-52c2aba0ad24\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.700040 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls74z\" (UniqueName: \"kubernetes.io/projected/8639e0ae-f968-4b8f-b73d-52c2aba0ad24-kube-api-access-ls74z\") pod \"nova-cell1-novncproxy-0\" (UID: \"8639e0ae-f968-4b8f-b73d-52c2aba0ad24\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.709597 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgqsg\" (UniqueName: \"kubernetes.io/projected/54e82a2b-aaf1-42bc-b424-d570d07b6830-kube-api-access-fgqsg\") pod \"nova-metadata-0\" (UID: \"54e82a2b-aaf1-42bc-b424-d570d07b6830\") " pod="openstack/nova-metadata-0" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.791489 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 10:36:12 crc kubenswrapper[4735]: I1001 10:36:12.804648 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 10:36:13 crc kubenswrapper[4735]: I1001 10:36:13.256163 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 10:36:13 crc kubenswrapper[4735]: W1001 10:36:13.257985 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54e82a2b_aaf1_42bc_b424_d570d07b6830.slice/crio-e0c56682caf5a30646b07ab6955616ec6d035e0a86de2a6660c5bbbc7b1addd6 WatchSource:0}: Error finding container e0c56682caf5a30646b07ab6955616ec6d035e0a86de2a6660c5bbbc7b1addd6: Status 404 returned error can't find the container with id e0c56682caf5a30646b07ab6955616ec6d035e0a86de2a6660c5bbbc7b1addd6 Oct 01 10:36:13 crc kubenswrapper[4735]: W1001 10:36:13.304131 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8639e0ae_f968_4b8f_b73d_52c2aba0ad24.slice/crio-c778b796782551da931432a7dddd11e44ac3366d5f15824f3392bef06de7922d WatchSource:0}: Error finding container c778b796782551da931432a7dddd11e44ac3366d5f15824f3392bef06de7922d: Status 404 returned error can't find the container with id c778b796782551da931432a7dddd11e44ac3366d5f15824f3392bef06de7922d Oct 01 10:36:13 crc kubenswrapper[4735]: I1001 10:36:13.308725 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 10:36:13 crc kubenswrapper[4735]: I1001 10:36:13.348234 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8639e0ae-f968-4b8f-b73d-52c2aba0ad24","Type":"ContainerStarted","Data":"c778b796782551da931432a7dddd11e44ac3366d5f15824f3392bef06de7922d"} Oct 01 10:36:13 crc kubenswrapper[4735]: I1001 10:36:13.349670 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"54e82a2b-aaf1-42bc-b424-d570d07b6830","Type":"ContainerStarted","Data":"e0c56682caf5a30646b07ab6955616ec6d035e0a86de2a6660c5bbbc7b1addd6"} Oct 01 10:36:13 crc kubenswrapper[4735]: I1001 10:36:13.910569 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66442e52-6cd9-44d4-a77f-9d67f83d4d94" path="/var/lib/kubelet/pods/66442e52-6cd9-44d4-a77f-9d67f83d4d94/volumes" Oct 01 10:36:13 crc kubenswrapper[4735]: I1001 10:36:13.911240 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be2afa25-77bd-421d-b0c6-67a3d31b642f" path="/var/lib/kubelet/pods/be2afa25-77bd-421d-b0c6-67a3d31b642f/volumes" Oct 01 10:36:14 crc kubenswrapper[4735]: I1001 10:36:14.362820 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8639e0ae-f968-4b8f-b73d-52c2aba0ad24","Type":"ContainerStarted","Data":"5fa46d612d9f8d22226311e751c737ab325231fea1740e4918baaeb72fb1acc3"} Oct 01 10:36:14 crc kubenswrapper[4735]: I1001 10:36:14.365130 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"54e82a2b-aaf1-42bc-b424-d570d07b6830","Type":"ContainerStarted","Data":"1b40a5aeb36e78a57ef5af8fb02b5864f5ca81c3e130921ad1ad8722dcd7ab68"} Oct 01 10:36:14 crc kubenswrapper[4735]: I1001 10:36:14.365172 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"54e82a2b-aaf1-42bc-b424-d570d07b6830","Type":"ContainerStarted","Data":"0df881b592fbfad805f3340095e25c2a9ce1db796e45ad65f7abb21e5a0af3cc"} Oct 01 10:36:14 crc kubenswrapper[4735]: I1001 10:36:14.385964 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.385940094 podStartE2EDuration="2.385940094s" podCreationTimestamp="2025-10-01 10:36:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 
10:36:14.384136086 +0000 UTC m=+1133.076957378" watchObservedRunningTime="2025-10-01 10:36:14.385940094 +0000 UTC m=+1133.078761396" Oct 01 10:36:14 crc kubenswrapper[4735]: I1001 10:36:14.407405 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.40738422 podStartE2EDuration="2.40738422s" podCreationTimestamp="2025-10-01 10:36:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:36:14.403240279 +0000 UTC m=+1133.096061541" watchObservedRunningTime="2025-10-01 10:36:14.40738422 +0000 UTC m=+1133.100205492" Oct 01 10:36:14 crc kubenswrapper[4735]: I1001 10:36:14.527641 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 01 10:36:14 crc kubenswrapper[4735]: I1001 10:36:14.528330 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 01 10:36:14 crc kubenswrapper[4735]: I1001 10:36:14.529854 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 01 10:36:14 crc kubenswrapper[4735]: I1001 10:36:14.542949 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 01 10:36:15 crc kubenswrapper[4735]: I1001 10:36:15.374361 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 01 10:36:15 crc kubenswrapper[4735]: I1001 10:36:15.378288 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 01 10:36:15 crc kubenswrapper[4735]: I1001 10:36:15.550089 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-x52mp"] Oct 01 10:36:15 crc kubenswrapper[4735]: I1001 10:36:15.552755 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-x52mp" Oct 01 10:36:15 crc kubenswrapper[4735]: I1001 10:36:15.562834 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-x52mp"] Oct 01 10:36:15 crc kubenswrapper[4735]: I1001 10:36:15.645424 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqj5f\" (UniqueName: \"kubernetes.io/projected/73183ba7-fe81-43ee-b62d-e843f83406c3-kube-api-access-lqj5f\") pod \"dnsmasq-dns-59cf4bdb65-x52mp\" (UID: \"73183ba7-fe81-43ee-b62d-e843f83406c3\") " pod="openstack/dnsmasq-dns-59cf4bdb65-x52mp" Oct 01 10:36:15 crc kubenswrapper[4735]: I1001 10:36:15.645471 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73183ba7-fe81-43ee-b62d-e843f83406c3-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-x52mp\" (UID: \"73183ba7-fe81-43ee-b62d-e843f83406c3\") " pod="openstack/dnsmasq-dns-59cf4bdb65-x52mp" Oct 01 10:36:15 crc kubenswrapper[4735]: I1001 10:36:15.645547 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73183ba7-fe81-43ee-b62d-e843f83406c3-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-x52mp\" (UID: \"73183ba7-fe81-43ee-b62d-e843f83406c3\") " pod="openstack/dnsmasq-dns-59cf4bdb65-x52mp" Oct 01 10:36:15 crc kubenswrapper[4735]: I1001 10:36:15.645576 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73183ba7-fe81-43ee-b62d-e843f83406c3-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-x52mp\" (UID: \"73183ba7-fe81-43ee-b62d-e843f83406c3\") " pod="openstack/dnsmasq-dns-59cf4bdb65-x52mp" Oct 01 10:36:15 crc kubenswrapper[4735]: I1001 10:36:15.645804 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73183ba7-fe81-43ee-b62d-e843f83406c3-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-x52mp\" (UID: \"73183ba7-fe81-43ee-b62d-e843f83406c3\") " pod="openstack/dnsmasq-dns-59cf4bdb65-x52mp" Oct 01 10:36:15 crc kubenswrapper[4735]: I1001 10:36:15.646218 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73183ba7-fe81-43ee-b62d-e843f83406c3-config\") pod \"dnsmasq-dns-59cf4bdb65-x52mp\" (UID: \"73183ba7-fe81-43ee-b62d-e843f83406c3\") " pod="openstack/dnsmasq-dns-59cf4bdb65-x52mp" Oct 01 10:36:15 crc kubenswrapper[4735]: I1001 10:36:15.750244 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73183ba7-fe81-43ee-b62d-e843f83406c3-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-x52mp\" (UID: \"73183ba7-fe81-43ee-b62d-e843f83406c3\") " pod="openstack/dnsmasq-dns-59cf4bdb65-x52mp" Oct 01 10:36:15 crc kubenswrapper[4735]: I1001 10:36:15.748791 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73183ba7-fe81-43ee-b62d-e843f83406c3-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-x52mp\" (UID: \"73183ba7-fe81-43ee-b62d-e843f83406c3\") " pod="openstack/dnsmasq-dns-59cf4bdb65-x52mp" Oct 01 10:36:15 crc kubenswrapper[4735]: I1001 10:36:15.750724 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqj5f\" (UniqueName: \"kubernetes.io/projected/73183ba7-fe81-43ee-b62d-e843f83406c3-kube-api-access-lqj5f\") pod \"dnsmasq-dns-59cf4bdb65-x52mp\" (UID: \"73183ba7-fe81-43ee-b62d-e843f83406c3\") " pod="openstack/dnsmasq-dns-59cf4bdb65-x52mp" Oct 01 10:36:15 crc kubenswrapper[4735]: I1001 10:36:15.751285 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73183ba7-fe81-43ee-b62d-e843f83406c3-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-x52mp\" (UID: \"73183ba7-fe81-43ee-b62d-e843f83406c3\") " pod="openstack/dnsmasq-dns-59cf4bdb65-x52mp" Oct 01 10:36:15 crc kubenswrapper[4735]: I1001 10:36:15.752480 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73183ba7-fe81-43ee-b62d-e843f83406c3-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-x52mp\" (UID: \"73183ba7-fe81-43ee-b62d-e843f83406c3\") " pod="openstack/dnsmasq-dns-59cf4bdb65-x52mp" Oct 01 10:36:15 crc kubenswrapper[4735]: I1001 10:36:15.752685 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73183ba7-fe81-43ee-b62d-e843f83406c3-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-x52mp\" (UID: \"73183ba7-fe81-43ee-b62d-e843f83406c3\") " pod="openstack/dnsmasq-dns-59cf4bdb65-x52mp" Oct 01 10:36:15 crc kubenswrapper[4735]: I1001 10:36:15.753783 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73183ba7-fe81-43ee-b62d-e843f83406c3-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-x52mp\" (UID: \"73183ba7-fe81-43ee-b62d-e843f83406c3\") " pod="openstack/dnsmasq-dns-59cf4bdb65-x52mp" Oct 01 10:36:15 crc kubenswrapper[4735]: I1001 10:36:15.754024 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73183ba7-fe81-43ee-b62d-e843f83406c3-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-x52mp\" (UID: \"73183ba7-fe81-43ee-b62d-e843f83406c3\") " pod="openstack/dnsmasq-dns-59cf4bdb65-x52mp" Oct 01 10:36:15 crc kubenswrapper[4735]: I1001 10:36:15.755137 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/73183ba7-fe81-43ee-b62d-e843f83406c3-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-x52mp\" (UID: \"73183ba7-fe81-43ee-b62d-e843f83406c3\") " pod="openstack/dnsmasq-dns-59cf4bdb65-x52mp" Oct 01 10:36:15 crc kubenswrapper[4735]: I1001 10:36:15.756608 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73183ba7-fe81-43ee-b62d-e843f83406c3-config\") pod \"dnsmasq-dns-59cf4bdb65-x52mp\" (UID: \"73183ba7-fe81-43ee-b62d-e843f83406c3\") " pod="openstack/dnsmasq-dns-59cf4bdb65-x52mp" Oct 01 10:36:15 crc kubenswrapper[4735]: I1001 10:36:15.756666 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73183ba7-fe81-43ee-b62d-e843f83406c3-config\") pod \"dnsmasq-dns-59cf4bdb65-x52mp\" (UID: \"73183ba7-fe81-43ee-b62d-e843f83406c3\") " pod="openstack/dnsmasq-dns-59cf4bdb65-x52mp" Oct 01 10:36:15 crc kubenswrapper[4735]: I1001 10:36:15.768408 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqj5f\" (UniqueName: \"kubernetes.io/projected/73183ba7-fe81-43ee-b62d-e843f83406c3-kube-api-access-lqj5f\") pod \"dnsmasq-dns-59cf4bdb65-x52mp\" (UID: \"73183ba7-fe81-43ee-b62d-e843f83406c3\") " pod="openstack/dnsmasq-dns-59cf4bdb65-x52mp" Oct 01 10:36:15 crc kubenswrapper[4735]: I1001 10:36:15.874624 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-x52mp" Oct 01 10:36:16 crc kubenswrapper[4735]: I1001 10:36:16.373210 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-x52mp"] Oct 01 10:36:16 crc kubenswrapper[4735]: W1001 10:36:16.376092 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73183ba7_fe81_43ee_b62d_e843f83406c3.slice/crio-55a81045279116e784ec8701c95fc5cf5da975607170253621762dbf3fd5af7e WatchSource:0}: Error finding container 55a81045279116e784ec8701c95fc5cf5da975607170253621762dbf3fd5af7e: Status 404 returned error can't find the container with id 55a81045279116e784ec8701c95fc5cf5da975607170253621762dbf3fd5af7e Oct 01 10:36:17 crc kubenswrapper[4735]: I1001 10:36:17.294757 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:36:17 crc kubenswrapper[4735]: I1001 10:36:17.295606 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7557d9f9-d00b-4692-95e1-1bdd819aab0c" containerName="ceilometer-central-agent" containerID="cri-o://82c84424392ab7e34ff96df026ef17b268a71cf33f4aa6c9c82074ad93dae7ed" gracePeriod=30 Oct 01 10:36:17 crc kubenswrapper[4735]: I1001 10:36:17.295679 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7557d9f9-d00b-4692-95e1-1bdd819aab0c" containerName="proxy-httpd" containerID="cri-o://11259723491c0a425c7bd72639e41d4500cab64791f279aabc736af27d3049ce" gracePeriod=30 Oct 01 10:36:17 crc kubenswrapper[4735]: I1001 10:36:17.295688 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7557d9f9-d00b-4692-95e1-1bdd819aab0c" containerName="sg-core" containerID="cri-o://1dfbf3599f5781c52cbb193ad097c3deb8c478248084dd9831b650b227a964d3" gracePeriod=30 Oct 01 10:36:17 crc 
kubenswrapper[4735]: I1001 10:36:17.295963 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7557d9f9-d00b-4692-95e1-1bdd819aab0c" containerName="ceilometer-notification-agent" containerID="cri-o://bcdf4338c51f60c8149c69f2bb0aae8f3a98e3fefe87f13720a8a25233e6382e" gracePeriod=30 Oct 01 10:36:17 crc kubenswrapper[4735]: I1001 10:36:17.399790 4735 generic.go:334] "Generic (PLEG): container finished" podID="73183ba7-fe81-43ee-b62d-e843f83406c3" containerID="8dee2661d98385c1217474cf803c0be7616076fbbe56fd12f2e678f22ed21359" exitCode=0 Oct 01 10:36:17 crc kubenswrapper[4735]: I1001 10:36:17.401951 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-x52mp" event={"ID":"73183ba7-fe81-43ee-b62d-e843f83406c3","Type":"ContainerDied","Data":"8dee2661d98385c1217474cf803c0be7616076fbbe56fd12f2e678f22ed21359"} Oct 01 10:36:17 crc kubenswrapper[4735]: I1001 10:36:17.401993 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-x52mp" event={"ID":"73183ba7-fe81-43ee-b62d-e843f83406c3","Type":"ContainerStarted","Data":"55a81045279116e784ec8701c95fc5cf5da975607170253621762dbf3fd5af7e"} Oct 01 10:36:17 crc kubenswrapper[4735]: I1001 10:36:17.792061 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 10:36:17 crc kubenswrapper[4735]: I1001 10:36:17.792369 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 10:36:17 crc kubenswrapper[4735]: I1001 10:36:17.805132 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 01 10:36:17 crc kubenswrapper[4735]: I1001 10:36:17.909927 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 10:36:18 crc kubenswrapper[4735]: I1001 10:36:18.410132 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-59cf4bdb65-x52mp" event={"ID":"73183ba7-fe81-43ee-b62d-e843f83406c3","Type":"ContainerStarted","Data":"d292d2caf6866c62adfa1b621927f50848b1212cedc33c7917ac6db36be8e688"} Oct 01 10:36:18 crc kubenswrapper[4735]: I1001 10:36:18.410260 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59cf4bdb65-x52mp" Oct 01 10:36:18 crc kubenswrapper[4735]: I1001 10:36:18.413576 4735 generic.go:334] "Generic (PLEG): container finished" podID="7557d9f9-d00b-4692-95e1-1bdd819aab0c" containerID="11259723491c0a425c7bd72639e41d4500cab64791f279aabc736af27d3049ce" exitCode=0 Oct 01 10:36:18 crc kubenswrapper[4735]: I1001 10:36:18.413616 4735 generic.go:334] "Generic (PLEG): container finished" podID="7557d9f9-d00b-4692-95e1-1bdd819aab0c" containerID="1dfbf3599f5781c52cbb193ad097c3deb8c478248084dd9831b650b227a964d3" exitCode=2 Oct 01 10:36:18 crc kubenswrapper[4735]: I1001 10:36:18.413630 4735 generic.go:334] "Generic (PLEG): container finished" podID="7557d9f9-d00b-4692-95e1-1bdd819aab0c" containerID="82c84424392ab7e34ff96df026ef17b268a71cf33f4aa6c9c82074ad93dae7ed" exitCode=0 Oct 01 10:36:18 crc kubenswrapper[4735]: I1001 10:36:18.413627 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7557d9f9-d00b-4692-95e1-1bdd819aab0c","Type":"ContainerDied","Data":"11259723491c0a425c7bd72639e41d4500cab64791f279aabc736af27d3049ce"} Oct 01 10:36:18 crc kubenswrapper[4735]: I1001 10:36:18.413687 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7557d9f9-d00b-4692-95e1-1bdd819aab0c","Type":"ContainerDied","Data":"1dfbf3599f5781c52cbb193ad097c3deb8c478248084dd9831b650b227a964d3"} Oct 01 10:36:18 crc kubenswrapper[4735]: I1001 10:36:18.413701 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7557d9f9-d00b-4692-95e1-1bdd819aab0c","Type":"ContainerDied","Data":"82c84424392ab7e34ff96df026ef17b268a71cf33f4aa6c9c82074ad93dae7ed"}
Oct 01 10:36:18 crc kubenswrapper[4735]: I1001 10:36:18.413946 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8273b492-e0a2-49e5-a162-5071ab3506a3" containerName="nova-api-log" containerID="cri-o://dfe3b96d98ee0727e913076a837c8baa075dd387f335b541e279ad8cef5f4883" gracePeriod=30
Oct 01 10:36:18 crc kubenswrapper[4735]: I1001 10:36:18.413982 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8273b492-e0a2-49e5-a162-5071ab3506a3" containerName="nova-api-api" containerID="cri-o://ad274fe08c5c87c032ee03c238f39a4e0565da78a9a515dd3b10741adbc03d2c" gracePeriod=30
Oct 01 10:36:18 crc kubenswrapper[4735]: I1001 10:36:18.451353 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59cf4bdb65-x52mp" podStartSLOduration=3.45133544 podStartE2EDuration="3.45133544s" podCreationTimestamp="2025-10-01 10:36:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:36:18.445272946 +0000 UTC m=+1137.138094208" watchObservedRunningTime="2025-10-01 10:36:18.45133544 +0000 UTC m=+1137.144156702"
Oct 01 10:36:19 crc kubenswrapper[4735]: I1001 10:36:19.422528 4735 generic.go:334] "Generic (PLEG): container finished" podID="8273b492-e0a2-49e5-a162-5071ab3506a3" containerID="dfe3b96d98ee0727e913076a837c8baa075dd387f335b541e279ad8cef5f4883" exitCode=143
Oct 01 10:36:19 crc kubenswrapper[4735]: I1001 10:36:19.422581 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8273b492-e0a2-49e5-a162-5071ab3506a3","Type":"ContainerDied","Data":"dfe3b96d98ee0727e913076a837c8baa075dd387f335b541e279ad8cef5f4883"}
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.052323 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.173810 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt5ql\" (UniqueName: \"kubernetes.io/projected/8273b492-e0a2-49e5-a162-5071ab3506a3-kube-api-access-xt5ql\") pod \"8273b492-e0a2-49e5-a162-5071ab3506a3\" (UID: \"8273b492-e0a2-49e5-a162-5071ab3506a3\") "
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.173914 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8273b492-e0a2-49e5-a162-5071ab3506a3-logs\") pod \"8273b492-e0a2-49e5-a162-5071ab3506a3\" (UID: \"8273b492-e0a2-49e5-a162-5071ab3506a3\") "
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.173978 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8273b492-e0a2-49e5-a162-5071ab3506a3-combined-ca-bundle\") pod \"8273b492-e0a2-49e5-a162-5071ab3506a3\" (UID: \"8273b492-e0a2-49e5-a162-5071ab3506a3\") "
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.174220 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8273b492-e0a2-49e5-a162-5071ab3506a3-config-data\") pod \"8273b492-e0a2-49e5-a162-5071ab3506a3\" (UID: \"8273b492-e0a2-49e5-a162-5071ab3506a3\") "
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.174910 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8273b492-e0a2-49e5-a162-5071ab3506a3-logs" (OuterVolumeSpecName: "logs") pod "8273b492-e0a2-49e5-a162-5071ab3506a3" (UID: "8273b492-e0a2-49e5-a162-5071ab3506a3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.185968 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8273b492-e0a2-49e5-a162-5071ab3506a3-kube-api-access-xt5ql" (OuterVolumeSpecName: "kube-api-access-xt5ql") pod "8273b492-e0a2-49e5-a162-5071ab3506a3" (UID: "8273b492-e0a2-49e5-a162-5071ab3506a3"). InnerVolumeSpecName "kube-api-access-xt5ql". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.202967 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8273b492-e0a2-49e5-a162-5071ab3506a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8273b492-e0a2-49e5-a162-5071ab3506a3" (UID: "8273b492-e0a2-49e5-a162-5071ab3506a3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.217313 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8273b492-e0a2-49e5-a162-5071ab3506a3-config-data" (OuterVolumeSpecName: "config-data") pod "8273b492-e0a2-49e5-a162-5071ab3506a3" (UID: "8273b492-e0a2-49e5-a162-5071ab3506a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.276795 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8273b492-e0a2-49e5-a162-5071ab3506a3-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.276839 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt5ql\" (UniqueName: \"kubernetes.io/projected/8273b492-e0a2-49e5-a162-5071ab3506a3-kube-api-access-xt5ql\") on node \"crc\" DevicePath \"\""
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.276851 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8273b492-e0a2-49e5-a162-5071ab3506a3-logs\") on node \"crc\" DevicePath \"\""
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.276862 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8273b492-e0a2-49e5-a162-5071ab3506a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.453705 4735 generic.go:334] "Generic (PLEG): container finished" podID="8273b492-e0a2-49e5-a162-5071ab3506a3" containerID="ad274fe08c5c87c032ee03c238f39a4e0565da78a9a515dd3b10741adbc03d2c" exitCode=0
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.453749 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8273b492-e0a2-49e5-a162-5071ab3506a3","Type":"ContainerDied","Data":"ad274fe08c5c87c032ee03c238f39a4e0565da78a9a515dd3b10741adbc03d2c"}
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.453773 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8273b492-e0a2-49e5-a162-5071ab3506a3","Type":"ContainerDied","Data":"b809460aa8c72d3bca246d889537a1c6f85a043454a768d12dfa6db10fdb84a3"}
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.453788 4735 scope.go:117] "RemoveContainer" containerID="ad274fe08c5c87c032ee03c238f39a4e0565da78a9a515dd3b10741adbc03d2c"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.453927 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.475334 4735 scope.go:117] "RemoveContainer" containerID="dfe3b96d98ee0727e913076a837c8baa075dd387f335b541e279ad8cef5f4883"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.503359 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.513545 4735 scope.go:117] "RemoveContainer" containerID="ad274fe08c5c87c032ee03c238f39a4e0565da78a9a515dd3b10741adbc03d2c"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.514775 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Oct 01 10:36:22 crc kubenswrapper[4735]: E1001 10:36:22.516890 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad274fe08c5c87c032ee03c238f39a4e0565da78a9a515dd3b10741adbc03d2c\": container with ID starting with ad274fe08c5c87c032ee03c238f39a4e0565da78a9a515dd3b10741adbc03d2c not found: ID does not exist" containerID="ad274fe08c5c87c032ee03c238f39a4e0565da78a9a515dd3b10741adbc03d2c"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.516926 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad274fe08c5c87c032ee03c238f39a4e0565da78a9a515dd3b10741adbc03d2c"} err="failed to get container status \"ad274fe08c5c87c032ee03c238f39a4e0565da78a9a515dd3b10741adbc03d2c\": rpc error: code = NotFound desc = could not find container \"ad274fe08c5c87c032ee03c238f39a4e0565da78a9a515dd3b10741adbc03d2c\": container with ID starting with ad274fe08c5c87c032ee03c238f39a4e0565da78a9a515dd3b10741adbc03d2c not found: ID does not exist"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.516949 4735 scope.go:117] "RemoveContainer" containerID="dfe3b96d98ee0727e913076a837c8baa075dd387f335b541e279ad8cef5f4883"
Oct 01 10:36:22 crc kubenswrapper[4735]: E1001 10:36:22.517428 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfe3b96d98ee0727e913076a837c8baa075dd387f335b541e279ad8cef5f4883\": container with ID starting with dfe3b96d98ee0727e913076a837c8baa075dd387f335b541e279ad8cef5f4883 not found: ID does not exist" containerID="dfe3b96d98ee0727e913076a837c8baa075dd387f335b541e279ad8cef5f4883"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.517449 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfe3b96d98ee0727e913076a837c8baa075dd387f335b541e279ad8cef5f4883"} err="failed to get container status \"dfe3b96d98ee0727e913076a837c8baa075dd387f335b541e279ad8cef5f4883\": rpc error: code = NotFound desc = could not find container \"dfe3b96d98ee0727e913076a837c8baa075dd387f335b541e279ad8cef5f4883\": container with ID starting with dfe3b96d98ee0727e913076a837c8baa075dd387f335b541e279ad8cef5f4883 not found: ID does not exist"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.521337 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Oct 01 10:36:22 crc kubenswrapper[4735]: E1001 10:36:22.521867 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8273b492-e0a2-49e5-a162-5071ab3506a3" containerName="nova-api-log"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.521893 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8273b492-e0a2-49e5-a162-5071ab3506a3" containerName="nova-api-log"
Oct 01 10:36:22 crc kubenswrapper[4735]: E1001 10:36:22.521940 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8273b492-e0a2-49e5-a162-5071ab3506a3" containerName="nova-api-api"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.521950 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8273b492-e0a2-49e5-a162-5071ab3506a3" containerName="nova-api-api"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.522176 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="8273b492-e0a2-49e5-a162-5071ab3506a3" containerName="nova-api-log"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.522202 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="8273b492-e0a2-49e5-a162-5071ab3506a3" containerName="nova-api-api"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.523425 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.525741 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.525927 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.525959 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.532529 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.683481 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9653fd1a-2f70-4b24-9050-b6a32e793e68-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9653fd1a-2f70-4b24-9050-b6a32e793e68\") " pod="openstack/nova-api-0"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.683573 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9653fd1a-2f70-4b24-9050-b6a32e793e68-public-tls-certs\") pod \"nova-api-0\" (UID: \"9653fd1a-2f70-4b24-9050-b6a32e793e68\") " pod="openstack/nova-api-0"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.683595 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9653fd1a-2f70-4b24-9050-b6a32e793e68-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9653fd1a-2f70-4b24-9050-b6a32e793e68\") " pod="openstack/nova-api-0"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.683620 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9653fd1a-2f70-4b24-9050-b6a32e793e68-config-data\") pod \"nova-api-0\" (UID: \"9653fd1a-2f70-4b24-9050-b6a32e793e68\") " pod="openstack/nova-api-0"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.683668 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9653fd1a-2f70-4b24-9050-b6a32e793e68-logs\") pod \"nova-api-0\" (UID: \"9653fd1a-2f70-4b24-9050-b6a32e793e68\") " pod="openstack/nova-api-0"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.683720 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb4db\" (UniqueName: \"kubernetes.io/projected/9653fd1a-2f70-4b24-9050-b6a32e793e68-kube-api-access-nb4db\") pod \"nova-api-0\" (UID: \"9653fd1a-2f70-4b24-9050-b6a32e793e68\") " pod="openstack/nova-api-0"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.784995 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9653fd1a-2f70-4b24-9050-b6a32e793e68-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9653fd1a-2f70-4b24-9050-b6a32e793e68\") " pod="openstack/nova-api-0"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.785086 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9653fd1a-2f70-4b24-9050-b6a32e793e68-public-tls-certs\") pod \"nova-api-0\" (UID: \"9653fd1a-2f70-4b24-9050-b6a32e793e68\") " pod="openstack/nova-api-0"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.785115 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9653fd1a-2f70-4b24-9050-b6a32e793e68-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9653fd1a-2f70-4b24-9050-b6a32e793e68\") " pod="openstack/nova-api-0"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.785147 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9653fd1a-2f70-4b24-9050-b6a32e793e68-config-data\") pod \"nova-api-0\" (UID: \"9653fd1a-2f70-4b24-9050-b6a32e793e68\") " pod="openstack/nova-api-0"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.785213 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9653fd1a-2f70-4b24-9050-b6a32e793e68-logs\") pod \"nova-api-0\" (UID: \"9653fd1a-2f70-4b24-9050-b6a32e793e68\") " pod="openstack/nova-api-0"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.785266 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb4db\" (UniqueName: \"kubernetes.io/projected/9653fd1a-2f70-4b24-9050-b6a32e793e68-kube-api-access-nb4db\") pod \"nova-api-0\" (UID: \"9653fd1a-2f70-4b24-9050-b6a32e793e68\") " pod="openstack/nova-api-0"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.786216 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9653fd1a-2f70-4b24-9050-b6a32e793e68-logs\") pod \"nova-api-0\" (UID: \"9653fd1a-2f70-4b24-9050-b6a32e793e68\") " pod="openstack/nova-api-0"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.792115 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.792151 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9653fd1a-2f70-4b24-9050-b6a32e793e68-config-data\") pod \"nova-api-0\" (UID: \"9653fd1a-2f70-4b24-9050-b6a32e793e68\") " pod="openstack/nova-api-0"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.792121 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9653fd1a-2f70-4b24-9050-b6a32e793e68-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9653fd1a-2f70-4b24-9050-b6a32e793e68\") " pod="openstack/nova-api-0"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.792199 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9653fd1a-2f70-4b24-9050-b6a32e793e68-public-tls-certs\") pod \"nova-api-0\" (UID: \"9653fd1a-2f70-4b24-9050-b6a32e793e68\") " pod="openstack/nova-api-0"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.792246 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.792570 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9653fd1a-2f70-4b24-9050-b6a32e793e68-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9653fd1a-2f70-4b24-9050-b6a32e793e68\") " pod="openstack/nova-api-0"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.805296 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.810023 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb4db\" (UniqueName: \"kubernetes.io/projected/9653fd1a-2f70-4b24-9050-b6a32e793e68-kube-api-access-nb4db\") pod \"nova-api-0\" (UID: \"9653fd1a-2f70-4b24-9050-b6a32e793e68\") " pod="openstack/nova-api-0"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.825871 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Oct 01 10:36:22 crc kubenswrapper[4735]: I1001 10:36:22.839661 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.311553 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 01 10:36:23 crc kubenswrapper[4735]: W1001 10:36:23.339050 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9653fd1a_2f70_4b24_9050_b6a32e793e68.slice/crio-6dcfff02e64225369215252daef389b8e334d6892cf062db3b8be71232bf1167 WatchSource:0}: Error finding container 6dcfff02e64225369215252daef389b8e334d6892cf062db3b8be71232bf1167: Status 404 returned error can't find the container with id 6dcfff02e64225369215252daef389b8e334d6892cf062db3b8be71232bf1167
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.463605 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9653fd1a-2f70-4b24-9050-b6a32e793e68","Type":"ContainerStarted","Data":"6dcfff02e64225369215252daef389b8e334d6892cf062db3b8be71232bf1167"}
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.468315 4735 generic.go:334] "Generic (PLEG): container finished" podID="7557d9f9-d00b-4692-95e1-1bdd819aab0c" containerID="bcdf4338c51f60c8149c69f2bb0aae8f3a98e3fefe87f13720a8a25233e6382e" exitCode=0
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.468378 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7557d9f9-d00b-4692-95e1-1bdd819aab0c","Type":"ContainerDied","Data":"bcdf4338c51f60c8149c69f2bb0aae8f3a98e3fefe87f13720a8a25233e6382e"}
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.493466 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.592022 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.702379 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7557d9f9-d00b-4692-95e1-1bdd819aab0c-log-httpd\") pod \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\" (UID: \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\") "
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.702434 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7557d9f9-d00b-4692-95e1-1bdd819aab0c-sg-core-conf-yaml\") pod \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\" (UID: \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\") "
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.702464 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c92j7\" (UniqueName: \"kubernetes.io/projected/7557d9f9-d00b-4692-95e1-1bdd819aab0c-kube-api-access-c92j7\") pod \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\" (UID: \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\") "
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.702555 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7557d9f9-d00b-4692-95e1-1bdd819aab0c-scripts\") pod \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\" (UID: \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\") "
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.702655 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7557d9f9-d00b-4692-95e1-1bdd819aab0c-ceilometer-tls-certs\") pod \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\" (UID: \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\") "
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.702700 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7557d9f9-d00b-4692-95e1-1bdd819aab0c-combined-ca-bundle\") pod \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\" (UID: \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\") "
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.702782 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7557d9f9-d00b-4692-95e1-1bdd819aab0c-run-httpd\") pod \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\" (UID: \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\") "
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.702801 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7557d9f9-d00b-4692-95e1-1bdd819aab0c-config-data\") pod \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\" (UID: \"7557d9f9-d00b-4692-95e1-1bdd819aab0c\") "
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.703478 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7557d9f9-d00b-4692-95e1-1bdd819aab0c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7557d9f9-d00b-4692-95e1-1bdd819aab0c" (UID: "7557d9f9-d00b-4692-95e1-1bdd819aab0c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.704670 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7557d9f9-d00b-4692-95e1-1bdd819aab0c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7557d9f9-d00b-4692-95e1-1bdd819aab0c" (UID: "7557d9f9-d00b-4692-95e1-1bdd819aab0c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.708936 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7557d9f9-d00b-4692-95e1-1bdd819aab0c-scripts" (OuterVolumeSpecName: "scripts") pod "7557d9f9-d00b-4692-95e1-1bdd819aab0c" (UID: "7557d9f9-d00b-4692-95e1-1bdd819aab0c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.717351 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7557d9f9-d00b-4692-95e1-1bdd819aab0c-kube-api-access-c92j7" (OuterVolumeSpecName: "kube-api-access-c92j7") pod "7557d9f9-d00b-4692-95e1-1bdd819aab0c" (UID: "7557d9f9-d00b-4692-95e1-1bdd819aab0c"). InnerVolumeSpecName "kube-api-access-c92j7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.728701 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-gxllh"]
Oct 01 10:36:23 crc kubenswrapper[4735]: E1001 10:36:23.729083 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7557d9f9-d00b-4692-95e1-1bdd819aab0c" containerName="sg-core"
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.729095 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7557d9f9-d00b-4692-95e1-1bdd819aab0c" containerName="sg-core"
Oct 01 10:36:23 crc kubenswrapper[4735]: E1001 10:36:23.729106 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7557d9f9-d00b-4692-95e1-1bdd819aab0c" containerName="ceilometer-notification-agent"
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.729112 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7557d9f9-d00b-4692-95e1-1bdd819aab0c" containerName="ceilometer-notification-agent"
Oct 01 10:36:23 crc kubenswrapper[4735]: E1001 10:36:23.729126 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7557d9f9-d00b-4692-95e1-1bdd819aab0c" containerName="proxy-httpd"
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.729132 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7557d9f9-d00b-4692-95e1-1bdd819aab0c" containerName="proxy-httpd"
Oct 01 10:36:23 crc kubenswrapper[4735]: E1001 10:36:23.729155 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7557d9f9-d00b-4692-95e1-1bdd819aab0c" containerName="ceilometer-central-agent"
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.729160 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7557d9f9-d00b-4692-95e1-1bdd819aab0c" containerName="ceilometer-central-agent"
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.729319 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7557d9f9-d00b-4692-95e1-1bdd819aab0c" containerName="proxy-httpd"
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.729329 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7557d9f9-d00b-4692-95e1-1bdd819aab0c" containerName="ceilometer-central-agent"
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.729346 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7557d9f9-d00b-4692-95e1-1bdd819aab0c" containerName="sg-core"
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.729362 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7557d9f9-d00b-4692-95e1-1bdd819aab0c" containerName="ceilometer-notification-agent"
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.729943 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gxllh"
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.737992 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.738148 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.742844 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-gxllh"]
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.788615 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7557d9f9-d00b-4692-95e1-1bdd819aab0c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7557d9f9-d00b-4692-95e1-1bdd819aab0c" (UID: "7557d9f9-d00b-4692-95e1-1bdd819aab0c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.799649 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7557d9f9-d00b-4692-95e1-1bdd819aab0c-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "7557d9f9-d00b-4692-95e1-1bdd819aab0c" (UID: "7557d9f9-d00b-4692-95e1-1bdd819aab0c"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.805886 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="54e82a2b-aaf1-42bc-b424-d570d07b6830" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.198:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.805958 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="54e82a2b-aaf1-42bc-b424-d570d07b6830" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.198:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.806583 4735 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7557d9f9-d00b-4692-95e1-1bdd819aab0c-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.806604 4735 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7557d9f9-d00b-4692-95e1-1bdd819aab0c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.806615 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c92j7\" (UniqueName: \"kubernetes.io/projected/7557d9f9-d00b-4692-95e1-1bdd819aab0c-kube-api-access-c92j7\") on node \"crc\" DevicePath \"\""
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.806623 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7557d9f9-d00b-4692-95e1-1bdd819aab0c-scripts\") on node \"crc\" DevicePath \"\""
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.806631 4735 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7557d9f9-d00b-4692-95e1-1bdd819aab0c-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.806639 4735 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7557d9f9-d00b-4692-95e1-1bdd819aab0c-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.833570 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7557d9f9-d00b-4692-95e1-1bdd819aab0c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7557d9f9-d00b-4692-95e1-1bdd819aab0c" (UID: "7557d9f9-d00b-4692-95e1-1bdd819aab0c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.873025 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7557d9f9-d00b-4692-95e1-1bdd819aab0c-config-data" (OuterVolumeSpecName: "config-data") pod "7557d9f9-d00b-4692-95e1-1bdd819aab0c" (UID: "7557d9f9-d00b-4692-95e1-1bdd819aab0c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.908095 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8273b492-e0a2-49e5-a162-5071ab3506a3" path="/var/lib/kubelet/pods/8273b492-e0a2-49e5-a162-5071ab3506a3/volumes"
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.909645 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b35d0ee3-23af-4661-88bc-df962b75ced3-config-data\") pod \"nova-cell1-cell-mapping-gxllh\" (UID: \"b35d0ee3-23af-4661-88bc-df962b75ced3\") " pod="openstack/nova-cell1-cell-mapping-gxllh"
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.909703 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b35d0ee3-23af-4661-88bc-df962b75ced3-scripts\") pod \"nova-cell1-cell-mapping-gxllh\" (UID: \"b35d0ee3-23af-4661-88bc-df962b75ced3\") " pod="openstack/nova-cell1-cell-mapping-gxllh"
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.909804 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b35d0ee3-23af-4661-88bc-df962b75ced3-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-gxllh\" (UID: \"b35d0ee3-23af-4661-88bc-df962b75ced3\") " pod="openstack/nova-cell1-cell-mapping-gxllh"
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.909834 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvq4n\" (UniqueName: \"kubernetes.io/projected/b35d0ee3-23af-4661-88bc-df962b75ced3-kube-api-access-gvq4n\") pod \"nova-cell1-cell-mapping-gxllh\" (UID: \"b35d0ee3-23af-4661-88bc-df962b75ced3\") " pod="openstack/nova-cell1-cell-mapping-gxllh"
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.909888 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7557d9f9-d00b-4692-95e1-1bdd819aab0c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 10:36:23 crc kubenswrapper[4735]: I1001 10:36:23.909898 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7557d9f9-d00b-4692-95e1-1bdd819aab0c-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.011894 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b35d0ee3-23af-4661-88bc-df962b75ced3-config-data\") pod \"nova-cell1-cell-mapping-gxllh\" (UID: \"b35d0ee3-23af-4661-88bc-df962b75ced3\") " pod="openstack/nova-cell1-cell-mapping-gxllh"
Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.012276 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b35d0ee3-23af-4661-88bc-df962b75ced3-scripts\") pod \"nova-cell1-cell-mapping-gxllh\" (UID: \"b35d0ee3-23af-4661-88bc-df962b75ced3\") " pod="openstack/nova-cell1-cell-mapping-gxllh"
Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.012473 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b35d0ee3-23af-4661-88bc-df962b75ced3-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-gxllh\" (UID: \"b35d0ee3-23af-4661-88bc-df962b75ced3\") " pod="openstack/nova-cell1-cell-mapping-gxllh"
Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.012615 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvq4n\" (UniqueName: \"kubernetes.io/projected/b35d0ee3-23af-4661-88bc-df962b75ced3-kube-api-access-gvq4n\") pod \"nova-cell1-cell-mapping-gxllh\" (UID: \"b35d0ee3-23af-4661-88bc-df962b75ced3\") " pod="openstack/nova-cell1-cell-mapping-gxllh"
Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.016966 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b35d0ee3-23af-4661-88bc-df962b75ced3-scripts\") pod \"nova-cell1-cell-mapping-gxllh\" (UID: \"b35d0ee3-23af-4661-88bc-df962b75ced3\") " pod="openstack/nova-cell1-cell-mapping-gxllh"
Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.019714 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b35d0ee3-23af-4661-88bc-df962b75ced3-config-data\") pod \"nova-cell1-cell-mapping-gxllh\" (UID: \"b35d0ee3-23af-4661-88bc-df962b75ced3\") " pod="openstack/nova-cell1-cell-mapping-gxllh"
Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.028150 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b35d0ee3-23af-4661-88bc-df962b75ced3-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-gxllh\" (UID: \"b35d0ee3-23af-4661-88bc-df962b75ced3\") " pod="openstack/nova-cell1-cell-mapping-gxllh"
Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.033146 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvq4n\" (UniqueName: \"kubernetes.io/projected/b35d0ee3-23af-4661-88bc-df962b75ced3-kube-api-access-gvq4n\") pod \"nova-cell1-cell-mapping-gxllh\" (UID: \"b35d0ee3-23af-4661-88bc-df962b75ced3\") " pod="openstack/nova-cell1-cell-mapping-gxllh"
Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.057409 4735 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gxllh" Oct 01 10:36:24 crc kubenswrapper[4735]: W1001 10:36:24.495694 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb35d0ee3_23af_4661_88bc_df962b75ced3.slice/crio-d6e28d1df187ce380f1d456d556076070040c6b6df16db385c67cfaf1a09de47 WatchSource:0}: Error finding container d6e28d1df187ce380f1d456d556076070040c6b6df16db385c67cfaf1a09de47: Status 404 returned error can't find the container with id d6e28d1df187ce380f1d456d556076070040c6b6df16db385c67cfaf1a09de47 Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.497539 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-gxllh"] Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.499333 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9653fd1a-2f70-4b24-9050-b6a32e793e68","Type":"ContainerStarted","Data":"6a9dae1b8fd8f547b9babef21dd58fa6882a5b2aa22524ad6de994b1d0b943a0"} Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.499365 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9653fd1a-2f70-4b24-9050-b6a32e793e68","Type":"ContainerStarted","Data":"bd792cfc1dcb4298fc4d2fc3898b3edc30958e3b4d3c0511c5f40308bad2ec48"} Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.503633 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7557d9f9-d00b-4692-95e1-1bdd819aab0c","Type":"ContainerDied","Data":"44665426dde9a835439f6fb8634453468f3f3a1432a33ca89acc4c299260e5cf"} Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.503738 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.503744 4735 scope.go:117] "RemoveContainer" containerID="11259723491c0a425c7bd72639e41d4500cab64791f279aabc736af27d3049ce" Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.524355 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.524330839 podStartE2EDuration="2.524330839s" podCreationTimestamp="2025-10-01 10:36:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:36:24.521889603 +0000 UTC m=+1143.214710865" watchObservedRunningTime="2025-10-01 10:36:24.524330839 +0000 UTC m=+1143.217152111" Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.526298 4735 scope.go:117] "RemoveContainer" containerID="1dfbf3599f5781c52cbb193ad097c3deb8c478248084dd9831b650b227a964d3" Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.559294 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.564911 4735 scope.go:117] "RemoveContainer" containerID="bcdf4338c51f60c8149c69f2bb0aae8f3a98e3fefe87f13720a8a25233e6382e" Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.581068 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.595511 4735 scope.go:117] "RemoveContainer" containerID="82c84424392ab7e34ff96df026ef17b268a71cf33f4aa6c9c82074ad93dae7ed" Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.603171 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.605343 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.610236 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.610392 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.610488 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.625315 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.726336 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/181bb1ae-a923-4bbd-8d8f-e5d8c8878214-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"181bb1ae-a923-4bbd-8d8f-e5d8c8878214\") " pod="openstack/ceilometer-0" Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.726403 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/181bb1ae-a923-4bbd-8d8f-e5d8c8878214-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"181bb1ae-a923-4bbd-8d8f-e5d8c8878214\") " pod="openstack/ceilometer-0" Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.726433 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/181bb1ae-a923-4bbd-8d8f-e5d8c8878214-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"181bb1ae-a923-4bbd-8d8f-e5d8c8878214\") " pod="openstack/ceilometer-0" Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.726829 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/181bb1ae-a923-4bbd-8d8f-e5d8c8878214-scripts\") pod \"ceilometer-0\" (UID: \"181bb1ae-a923-4bbd-8d8f-e5d8c8878214\") " pod="openstack/ceilometer-0" Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.726981 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/181bb1ae-a923-4bbd-8d8f-e5d8c8878214-log-httpd\") pod \"ceilometer-0\" (UID: \"181bb1ae-a923-4bbd-8d8f-e5d8c8878214\") " pod="openstack/ceilometer-0" Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.727095 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/181bb1ae-a923-4bbd-8d8f-e5d8c8878214-config-data\") pod \"ceilometer-0\" (UID: \"181bb1ae-a923-4bbd-8d8f-e5d8c8878214\") " pod="openstack/ceilometer-0" Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.727152 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkfb9\" (UniqueName: \"kubernetes.io/projected/181bb1ae-a923-4bbd-8d8f-e5d8c8878214-kube-api-access-rkfb9\") pod \"ceilometer-0\" (UID: \"181bb1ae-a923-4bbd-8d8f-e5d8c8878214\") " pod="openstack/ceilometer-0" Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.727269 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/181bb1ae-a923-4bbd-8d8f-e5d8c8878214-run-httpd\") pod \"ceilometer-0\" (UID: \"181bb1ae-a923-4bbd-8d8f-e5d8c8878214\") " pod="openstack/ceilometer-0" Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.828706 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/181bb1ae-a923-4bbd-8d8f-e5d8c8878214-log-httpd\") pod \"ceilometer-0\" (UID: 
\"181bb1ae-a923-4bbd-8d8f-e5d8c8878214\") " pod="openstack/ceilometer-0" Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.828778 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/181bb1ae-a923-4bbd-8d8f-e5d8c8878214-config-data\") pod \"ceilometer-0\" (UID: \"181bb1ae-a923-4bbd-8d8f-e5d8c8878214\") " pod="openstack/ceilometer-0" Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.828803 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkfb9\" (UniqueName: \"kubernetes.io/projected/181bb1ae-a923-4bbd-8d8f-e5d8c8878214-kube-api-access-rkfb9\") pod \"ceilometer-0\" (UID: \"181bb1ae-a923-4bbd-8d8f-e5d8c8878214\") " pod="openstack/ceilometer-0" Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.828839 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/181bb1ae-a923-4bbd-8d8f-e5d8c8878214-run-httpd\") pod \"ceilometer-0\" (UID: \"181bb1ae-a923-4bbd-8d8f-e5d8c8878214\") " pod="openstack/ceilometer-0" Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.828868 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/181bb1ae-a923-4bbd-8d8f-e5d8c8878214-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"181bb1ae-a923-4bbd-8d8f-e5d8c8878214\") " pod="openstack/ceilometer-0" Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.828894 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/181bb1ae-a923-4bbd-8d8f-e5d8c8878214-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"181bb1ae-a923-4bbd-8d8f-e5d8c8878214\") " pod="openstack/ceilometer-0" Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.828907 4735 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/181bb1ae-a923-4bbd-8d8f-e5d8c8878214-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"181bb1ae-a923-4bbd-8d8f-e5d8c8878214\") " pod="openstack/ceilometer-0" Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.829299 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/181bb1ae-a923-4bbd-8d8f-e5d8c8878214-log-httpd\") pod \"ceilometer-0\" (UID: \"181bb1ae-a923-4bbd-8d8f-e5d8c8878214\") " pod="openstack/ceilometer-0" Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.829374 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/181bb1ae-a923-4bbd-8d8f-e5d8c8878214-run-httpd\") pod \"ceilometer-0\" (UID: \"181bb1ae-a923-4bbd-8d8f-e5d8c8878214\") " pod="openstack/ceilometer-0" Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.829614 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/181bb1ae-a923-4bbd-8d8f-e5d8c8878214-scripts\") pod \"ceilometer-0\" (UID: \"181bb1ae-a923-4bbd-8d8f-e5d8c8878214\") " pod="openstack/ceilometer-0" Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.832529 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/181bb1ae-a923-4bbd-8d8f-e5d8c8878214-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"181bb1ae-a923-4bbd-8d8f-e5d8c8878214\") " pod="openstack/ceilometer-0" Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.832664 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/181bb1ae-a923-4bbd-8d8f-e5d8c8878214-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"181bb1ae-a923-4bbd-8d8f-e5d8c8878214\") " pod="openstack/ceilometer-0" Oct 01 10:36:24 crc kubenswrapper[4735]: 
I1001 10:36:24.833397 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/181bb1ae-a923-4bbd-8d8f-e5d8c8878214-scripts\") pod \"ceilometer-0\" (UID: \"181bb1ae-a923-4bbd-8d8f-e5d8c8878214\") " pod="openstack/ceilometer-0" Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.834609 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/181bb1ae-a923-4bbd-8d8f-e5d8c8878214-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"181bb1ae-a923-4bbd-8d8f-e5d8c8878214\") " pod="openstack/ceilometer-0" Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.837459 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/181bb1ae-a923-4bbd-8d8f-e5d8c8878214-config-data\") pod \"ceilometer-0\" (UID: \"181bb1ae-a923-4bbd-8d8f-e5d8c8878214\") " pod="openstack/ceilometer-0" Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.852212 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkfb9\" (UniqueName: \"kubernetes.io/projected/181bb1ae-a923-4bbd-8d8f-e5d8c8878214-kube-api-access-rkfb9\") pod \"ceilometer-0\" (UID: \"181bb1ae-a923-4bbd-8d8f-e5d8c8878214\") " pod="openstack/ceilometer-0" Oct 01 10:36:24 crc kubenswrapper[4735]: I1001 10:36:24.937679 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 10:36:25 crc kubenswrapper[4735]: I1001 10:36:25.372880 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 10:36:25 crc kubenswrapper[4735]: W1001 10:36:25.377979 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod181bb1ae_a923_4bbd_8d8f_e5d8c8878214.slice/crio-39777453f516abda39431b0b80ac3029ac41a7e0244359ef46e1ee9a4cf5537d WatchSource:0}: Error finding container 39777453f516abda39431b0b80ac3029ac41a7e0244359ef46e1ee9a4cf5537d: Status 404 returned error can't find the container with id 39777453f516abda39431b0b80ac3029ac41a7e0244359ef46e1ee9a4cf5537d Oct 01 10:36:25 crc kubenswrapper[4735]: I1001 10:36:25.517565 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"181bb1ae-a923-4bbd-8d8f-e5d8c8878214","Type":"ContainerStarted","Data":"39777453f516abda39431b0b80ac3029ac41a7e0244359ef46e1ee9a4cf5537d"} Oct 01 10:36:25 crc kubenswrapper[4735]: I1001 10:36:25.523083 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gxllh" event={"ID":"b35d0ee3-23af-4661-88bc-df962b75ced3","Type":"ContainerStarted","Data":"f5c95e6647646ffa4b3fa47094851620a1358671915e5a25be4b2f27ad7cc190"} Oct 01 10:36:25 crc kubenswrapper[4735]: I1001 10:36:25.523152 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gxllh" event={"ID":"b35d0ee3-23af-4661-88bc-df962b75ced3","Type":"ContainerStarted","Data":"d6e28d1df187ce380f1d456d556076070040c6b6df16db385c67cfaf1a09de47"} Oct 01 10:36:25 crc kubenswrapper[4735]: I1001 10:36:25.547844 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-gxllh" podStartSLOduration=2.547822695 podStartE2EDuration="2.547822695s" podCreationTimestamp="2025-10-01 10:36:23 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:36:25.545140422 +0000 UTC m=+1144.237961694" watchObservedRunningTime="2025-10-01 10:36:25.547822695 +0000 UTC m=+1144.240643957" Oct 01 10:36:25 crc kubenswrapper[4735]: I1001 10:36:25.877624 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59cf4bdb65-x52mp" Oct 01 10:36:25 crc kubenswrapper[4735]: I1001 10:36:25.935583 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7557d9f9-d00b-4692-95e1-1bdd819aab0c" path="/var/lib/kubelet/pods/7557d9f9-d00b-4692-95e1-1bdd819aab0c/volumes" Oct 01 10:36:25 crc kubenswrapper[4735]: I1001 10:36:25.975178 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-l6t6p"] Oct 01 10:36:25 crc kubenswrapper[4735]: I1001 10:36:25.975410 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-845d6d6f59-l6t6p" podUID="9ae06196-4040-40db-9dd1-2f4a7c1f616c" containerName="dnsmasq-dns" containerID="cri-o://b9889dfa44cc582345a7da54aff09deb72e53dbd37a1870a6bd609b9c69767df" gracePeriod=10 Oct 01 10:36:26 crc kubenswrapper[4735]: I1001 10:36:26.390729 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-l6t6p" Oct 01 10:36:26 crc kubenswrapper[4735]: I1001 10:36:26.532916 4735 generic.go:334] "Generic (PLEG): container finished" podID="9ae06196-4040-40db-9dd1-2f4a7c1f616c" containerID="b9889dfa44cc582345a7da54aff09deb72e53dbd37a1870a6bd609b9c69767df" exitCode=0 Oct 01 10:36:26 crc kubenswrapper[4735]: I1001 10:36:26.532969 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-l6t6p" event={"ID":"9ae06196-4040-40db-9dd1-2f4a7c1f616c","Type":"ContainerDied","Data":"b9889dfa44cc582345a7da54aff09deb72e53dbd37a1870a6bd609b9c69767df"} Oct 01 10:36:26 crc kubenswrapper[4735]: I1001 10:36:26.533020 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-l6t6p" event={"ID":"9ae06196-4040-40db-9dd1-2f4a7c1f616c","Type":"ContainerDied","Data":"11b76b8ee8009b0ff07204778d44ca7ae6d03274afa2f240de1a8938c1b7964b"} Oct 01 10:36:26 crc kubenswrapper[4735]: I1001 10:36:26.533047 4735 scope.go:117] "RemoveContainer" containerID="b9889dfa44cc582345a7da54aff09deb72e53dbd37a1870a6bd609b9c69767df" Oct 01 10:36:26 crc kubenswrapper[4735]: I1001 10:36:26.533898 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-l6t6p" Oct 01 10:36:26 crc kubenswrapper[4735]: I1001 10:36:26.535701 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"181bb1ae-a923-4bbd-8d8f-e5d8c8878214","Type":"ContainerStarted","Data":"4d24f379ce1edd58b13927e1e0f5bdbb9a28b670967cc2d6db528f1649c104ef"} Oct 01 10:36:26 crc kubenswrapper[4735]: I1001 10:36:26.558518 4735 scope.go:117] "RemoveContainer" containerID="008f508382ea043136330aa3d2f98a5a30ff05ccd4ebe25d58be351e0d0bbffe" Oct 01 10:36:26 crc kubenswrapper[4735]: I1001 10:36:26.571638 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ae06196-4040-40db-9dd1-2f4a7c1f616c-dns-swift-storage-0\") pod \"9ae06196-4040-40db-9dd1-2f4a7c1f616c\" (UID: \"9ae06196-4040-40db-9dd1-2f4a7c1f616c\") " Oct 01 10:36:26 crc kubenswrapper[4735]: I1001 10:36:26.571710 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjqx6\" (UniqueName: \"kubernetes.io/projected/9ae06196-4040-40db-9dd1-2f4a7c1f616c-kube-api-access-tjqx6\") pod \"9ae06196-4040-40db-9dd1-2f4a7c1f616c\" (UID: \"9ae06196-4040-40db-9dd1-2f4a7c1f616c\") " Oct 01 10:36:26 crc kubenswrapper[4735]: I1001 10:36:26.571797 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ae06196-4040-40db-9dd1-2f4a7c1f616c-dns-svc\") pod \"9ae06196-4040-40db-9dd1-2f4a7c1f616c\" (UID: \"9ae06196-4040-40db-9dd1-2f4a7c1f616c\") " Oct 01 10:36:26 crc kubenswrapper[4735]: I1001 10:36:26.571886 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ae06196-4040-40db-9dd1-2f4a7c1f616c-ovsdbserver-nb\") pod \"9ae06196-4040-40db-9dd1-2f4a7c1f616c\" (UID: \"9ae06196-4040-40db-9dd1-2f4a7c1f616c\") " Oct 01 10:36:26 crc 
kubenswrapper[4735]: I1001 10:36:26.571934 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ae06196-4040-40db-9dd1-2f4a7c1f616c-config\") pod \"9ae06196-4040-40db-9dd1-2f4a7c1f616c\" (UID: \"9ae06196-4040-40db-9dd1-2f4a7c1f616c\") " Oct 01 10:36:26 crc kubenswrapper[4735]: I1001 10:36:26.572004 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ae06196-4040-40db-9dd1-2f4a7c1f616c-ovsdbserver-sb\") pod \"9ae06196-4040-40db-9dd1-2f4a7c1f616c\" (UID: \"9ae06196-4040-40db-9dd1-2f4a7c1f616c\") " Oct 01 10:36:26 crc kubenswrapper[4735]: I1001 10:36:26.577887 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ae06196-4040-40db-9dd1-2f4a7c1f616c-kube-api-access-tjqx6" (OuterVolumeSpecName: "kube-api-access-tjqx6") pod "9ae06196-4040-40db-9dd1-2f4a7c1f616c" (UID: "9ae06196-4040-40db-9dd1-2f4a7c1f616c"). InnerVolumeSpecName "kube-api-access-tjqx6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:36:26 crc kubenswrapper[4735]: I1001 10:36:26.580222 4735 scope.go:117] "RemoveContainer" containerID="b9889dfa44cc582345a7da54aff09deb72e53dbd37a1870a6bd609b9c69767df" Oct 01 10:36:26 crc kubenswrapper[4735]: E1001 10:36:26.580921 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9889dfa44cc582345a7da54aff09deb72e53dbd37a1870a6bd609b9c69767df\": container with ID starting with b9889dfa44cc582345a7da54aff09deb72e53dbd37a1870a6bd609b9c69767df not found: ID does not exist" containerID="b9889dfa44cc582345a7da54aff09deb72e53dbd37a1870a6bd609b9c69767df" Oct 01 10:36:26 crc kubenswrapper[4735]: I1001 10:36:26.580967 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9889dfa44cc582345a7da54aff09deb72e53dbd37a1870a6bd609b9c69767df"} err="failed to get container status \"b9889dfa44cc582345a7da54aff09deb72e53dbd37a1870a6bd609b9c69767df\": rpc error: code = NotFound desc = could not find container \"b9889dfa44cc582345a7da54aff09deb72e53dbd37a1870a6bd609b9c69767df\": container with ID starting with b9889dfa44cc582345a7da54aff09deb72e53dbd37a1870a6bd609b9c69767df not found: ID does not exist" Oct 01 10:36:26 crc kubenswrapper[4735]: I1001 10:36:26.580999 4735 scope.go:117] "RemoveContainer" containerID="008f508382ea043136330aa3d2f98a5a30ff05ccd4ebe25d58be351e0d0bbffe" Oct 01 10:36:26 crc kubenswrapper[4735]: E1001 10:36:26.581307 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"008f508382ea043136330aa3d2f98a5a30ff05ccd4ebe25d58be351e0d0bbffe\": container with ID starting with 008f508382ea043136330aa3d2f98a5a30ff05ccd4ebe25d58be351e0d0bbffe not found: ID does not exist" containerID="008f508382ea043136330aa3d2f98a5a30ff05ccd4ebe25d58be351e0d0bbffe" Oct 01 10:36:26 crc kubenswrapper[4735]: I1001 10:36:26.581394 
4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"008f508382ea043136330aa3d2f98a5a30ff05ccd4ebe25d58be351e0d0bbffe"} err="failed to get container status \"008f508382ea043136330aa3d2f98a5a30ff05ccd4ebe25d58be351e0d0bbffe\": rpc error: code = NotFound desc = could not find container \"008f508382ea043136330aa3d2f98a5a30ff05ccd4ebe25d58be351e0d0bbffe\": container with ID starting with 008f508382ea043136330aa3d2f98a5a30ff05ccd4ebe25d58be351e0d0bbffe not found: ID does not exist" Oct 01 10:36:26 crc kubenswrapper[4735]: I1001 10:36:26.635243 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ae06196-4040-40db-9dd1-2f4a7c1f616c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9ae06196-4040-40db-9dd1-2f4a7c1f616c" (UID: "9ae06196-4040-40db-9dd1-2f4a7c1f616c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:36:26 crc kubenswrapper[4735]: I1001 10:36:26.636068 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ae06196-4040-40db-9dd1-2f4a7c1f616c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9ae06196-4040-40db-9dd1-2f4a7c1f616c" (UID: "9ae06196-4040-40db-9dd1-2f4a7c1f616c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:36:26 crc kubenswrapper[4735]: I1001 10:36:26.654586 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ae06196-4040-40db-9dd1-2f4a7c1f616c-config" (OuterVolumeSpecName: "config") pod "9ae06196-4040-40db-9dd1-2f4a7c1f616c" (UID: "9ae06196-4040-40db-9dd1-2f4a7c1f616c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:36:26 crc kubenswrapper[4735]: I1001 10:36:26.656671 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ae06196-4040-40db-9dd1-2f4a7c1f616c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9ae06196-4040-40db-9dd1-2f4a7c1f616c" (UID: "9ae06196-4040-40db-9dd1-2f4a7c1f616c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:36:26 crc kubenswrapper[4735]: I1001 10:36:26.659267 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ae06196-4040-40db-9dd1-2f4a7c1f616c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9ae06196-4040-40db-9dd1-2f4a7c1f616c" (UID: "9ae06196-4040-40db-9dd1-2f4a7c1f616c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:36:26 crc kubenswrapper[4735]: I1001 10:36:26.674354 4735 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ae06196-4040-40db-9dd1-2f4a7c1f616c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 10:36:26 crc kubenswrapper[4735]: I1001 10:36:26.674750 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjqx6\" (UniqueName: \"kubernetes.io/projected/9ae06196-4040-40db-9dd1-2f4a7c1f616c-kube-api-access-tjqx6\") on node \"crc\" DevicePath \"\"" Oct 01 10:36:26 crc kubenswrapper[4735]: I1001 10:36:26.674873 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ae06196-4040-40db-9dd1-2f4a7c1f616c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 10:36:26 crc kubenswrapper[4735]: I1001 10:36:26.674975 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ae06196-4040-40db-9dd1-2f4a7c1f616c-ovsdbserver-nb\") on node \"crc\" DevicePath 
\"\"" Oct 01 10:36:26 crc kubenswrapper[4735]: I1001 10:36:26.675048 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ae06196-4040-40db-9dd1-2f4a7c1f616c-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:36:26 crc kubenswrapper[4735]: I1001 10:36:26.675114 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ae06196-4040-40db-9dd1-2f4a7c1f616c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 10:36:26 crc kubenswrapper[4735]: I1001 10:36:26.865856 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-l6t6p"] Oct 01 10:36:26 crc kubenswrapper[4735]: I1001 10:36:26.874796 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-l6t6p"] Oct 01 10:36:27 crc kubenswrapper[4735]: I1001 10:36:27.546193 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"181bb1ae-a923-4bbd-8d8f-e5d8c8878214","Type":"ContainerStarted","Data":"087eb1667fe3dd9b921d211ab7bf465887dd085b085388dfb23dd671d93bdb9a"} Oct 01 10:36:27 crc kubenswrapper[4735]: I1001 10:36:27.546410 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"181bb1ae-a923-4bbd-8d8f-e5d8c8878214","Type":"ContainerStarted","Data":"d1781c6b73b147260a629c45819fa56b6fff799c222b4cc0fef7a34b73ce6220"} Oct 01 10:36:27 crc kubenswrapper[4735]: I1001 10:36:27.907141 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ae06196-4040-40db-9dd1-2f4a7c1f616c" path="/var/lib/kubelet/pods/9ae06196-4040-40db-9dd1-2f4a7c1f616c/volumes" Oct 01 10:36:29 crc kubenswrapper[4735]: I1001 10:36:29.569076 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"181bb1ae-a923-4bbd-8d8f-e5d8c8878214","Type":"ContainerStarted","Data":"fe18805d5ecbace2dfa731b12a4459d4abe52d6dfedd9370a0ef71074154e49b"} Oct 01 10:36:29 crc kubenswrapper[4735]: I1001 10:36:29.569455 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 10:36:29 crc kubenswrapper[4735]: I1001 10:36:29.571743 4735 generic.go:334] "Generic (PLEG): container finished" podID="b35d0ee3-23af-4661-88bc-df962b75ced3" containerID="f5c95e6647646ffa4b3fa47094851620a1358671915e5a25be4b2f27ad7cc190" exitCode=0 Oct 01 10:36:29 crc kubenswrapper[4735]: I1001 10:36:29.571780 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gxllh" event={"ID":"b35d0ee3-23af-4661-88bc-df962b75ced3","Type":"ContainerDied","Data":"f5c95e6647646ffa4b3fa47094851620a1358671915e5a25be4b2f27ad7cc190"} Oct 01 10:36:29 crc kubenswrapper[4735]: I1001 10:36:29.612008 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.391647394 podStartE2EDuration="5.611983467s" podCreationTimestamp="2025-10-01 10:36:24 +0000 UTC" firstStartedPulling="2025-10-01 10:36:25.381722242 +0000 UTC m=+1144.074543504" lastFinishedPulling="2025-10-01 10:36:28.602058275 +0000 UTC m=+1147.294879577" observedRunningTime="2025-10-01 10:36:29.591066645 +0000 UTC m=+1148.283887907" watchObservedRunningTime="2025-10-01 10:36:29.611983467 +0000 UTC m=+1148.304804739" Oct 01 10:36:31 crc kubenswrapper[4735]: I1001 10:36:31.024087 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gxllh" Oct 01 10:36:31 crc kubenswrapper[4735]: I1001 10:36:31.168402 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b35d0ee3-23af-4661-88bc-df962b75ced3-config-data\") pod \"b35d0ee3-23af-4661-88bc-df962b75ced3\" (UID: \"b35d0ee3-23af-4661-88bc-df962b75ced3\") " Oct 01 10:36:31 crc kubenswrapper[4735]: I1001 10:36:31.168626 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b35d0ee3-23af-4661-88bc-df962b75ced3-scripts\") pod \"b35d0ee3-23af-4661-88bc-df962b75ced3\" (UID: \"b35d0ee3-23af-4661-88bc-df962b75ced3\") " Oct 01 10:36:31 crc kubenswrapper[4735]: I1001 10:36:31.168681 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b35d0ee3-23af-4661-88bc-df962b75ced3-combined-ca-bundle\") pod \"b35d0ee3-23af-4661-88bc-df962b75ced3\" (UID: \"b35d0ee3-23af-4661-88bc-df962b75ced3\") " Oct 01 10:36:31 crc kubenswrapper[4735]: I1001 10:36:31.168817 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvq4n\" (UniqueName: \"kubernetes.io/projected/b35d0ee3-23af-4661-88bc-df962b75ced3-kube-api-access-gvq4n\") pod \"b35d0ee3-23af-4661-88bc-df962b75ced3\" (UID: \"b35d0ee3-23af-4661-88bc-df962b75ced3\") " Oct 01 10:36:31 crc kubenswrapper[4735]: I1001 10:36:31.176837 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b35d0ee3-23af-4661-88bc-df962b75ced3-scripts" (OuterVolumeSpecName: "scripts") pod "b35d0ee3-23af-4661-88bc-df962b75ced3" (UID: "b35d0ee3-23af-4661-88bc-df962b75ced3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:36:31 crc kubenswrapper[4735]: I1001 10:36:31.177141 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b35d0ee3-23af-4661-88bc-df962b75ced3-kube-api-access-gvq4n" (OuterVolumeSpecName: "kube-api-access-gvq4n") pod "b35d0ee3-23af-4661-88bc-df962b75ced3" (UID: "b35d0ee3-23af-4661-88bc-df962b75ced3"). InnerVolumeSpecName "kube-api-access-gvq4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:36:31 crc kubenswrapper[4735]: I1001 10:36:31.214663 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b35d0ee3-23af-4661-88bc-df962b75ced3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b35d0ee3-23af-4661-88bc-df962b75ced3" (UID: "b35d0ee3-23af-4661-88bc-df962b75ced3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:36:31 crc kubenswrapper[4735]: I1001 10:36:31.227877 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b35d0ee3-23af-4661-88bc-df962b75ced3-config-data" (OuterVolumeSpecName: "config-data") pod "b35d0ee3-23af-4661-88bc-df962b75ced3" (UID: "b35d0ee3-23af-4661-88bc-df962b75ced3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:36:31 crc kubenswrapper[4735]: I1001 10:36:31.272186 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b35d0ee3-23af-4661-88bc-df962b75ced3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:36:31 crc kubenswrapper[4735]: I1001 10:36:31.272225 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvq4n\" (UniqueName: \"kubernetes.io/projected/b35d0ee3-23af-4661-88bc-df962b75ced3-kube-api-access-gvq4n\") on node \"crc\" DevicePath \"\"" Oct 01 10:36:31 crc kubenswrapper[4735]: I1001 10:36:31.272243 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b35d0ee3-23af-4661-88bc-df962b75ced3-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 10:36:31 crc kubenswrapper[4735]: I1001 10:36:31.272255 4735 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b35d0ee3-23af-4661-88bc-df962b75ced3-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 10:36:31 crc kubenswrapper[4735]: I1001 10:36:31.612235 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gxllh" event={"ID":"b35d0ee3-23af-4661-88bc-df962b75ced3","Type":"ContainerDied","Data":"d6e28d1df187ce380f1d456d556076070040c6b6df16db385c67cfaf1a09de47"} Oct 01 10:36:31 crc kubenswrapper[4735]: I1001 10:36:31.612327 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6e28d1df187ce380f1d456d556076070040c6b6df16db385c67cfaf1a09de47" Oct 01 10:36:31 crc kubenswrapper[4735]: I1001 10:36:31.612470 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gxllh" Oct 01 10:36:31 crc kubenswrapper[4735]: I1001 10:36:31.830837 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 10:36:31 crc kubenswrapper[4735]: I1001 10:36:31.831369 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9653fd1a-2f70-4b24-9050-b6a32e793e68" containerName="nova-api-log" containerID="cri-o://bd792cfc1dcb4298fc4d2fc3898b3edc30958e3b4d3c0511c5f40308bad2ec48" gracePeriod=30 Oct 01 10:36:31 crc kubenswrapper[4735]: I1001 10:36:31.831467 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9653fd1a-2f70-4b24-9050-b6a32e793e68" containerName="nova-api-api" containerID="cri-o://6a9dae1b8fd8f547b9babef21dd58fa6882a5b2aa22524ad6de994b1d0b943a0" gracePeriod=30 Oct 01 10:36:31 crc kubenswrapper[4735]: I1001 10:36:31.848623 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 10:36:31 crc kubenswrapper[4735]: I1001 10:36:31.848834 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="909500e0-2864-46b8-9b4f-a234e80419f7" containerName="nova-scheduler-scheduler" containerID="cri-o://9e8ffc005f7aff3ca002a6906c52fa45daab77abd816f8b76fc204f8b960fa9a" gracePeriod=30 Oct 01 10:36:31 crc kubenswrapper[4735]: I1001 10:36:31.863198 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 10:36:31 crc kubenswrapper[4735]: I1001 10:36:31.863872 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="54e82a2b-aaf1-42bc-b424-d570d07b6830" containerName="nova-metadata-log" containerID="cri-o://0df881b592fbfad805f3340095e25c2a9ce1db796e45ad65f7abb21e5a0af3cc" gracePeriod=30 Oct 01 10:36:31 crc kubenswrapper[4735]: I1001 10:36:31.864050 4735 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="54e82a2b-aaf1-42bc-b424-d570d07b6830" containerName="nova-metadata-metadata" containerID="cri-o://1b40a5aeb36e78a57ef5af8fb02b5864f5ca81c3e130921ad1ad8722dcd7ab68" gracePeriod=30 Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.428189 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.596937 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9653fd1a-2f70-4b24-9050-b6a32e793e68-logs\") pod \"9653fd1a-2f70-4b24-9050-b6a32e793e68\" (UID: \"9653fd1a-2f70-4b24-9050-b6a32e793e68\") " Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.597117 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9653fd1a-2f70-4b24-9050-b6a32e793e68-internal-tls-certs\") pod \"9653fd1a-2f70-4b24-9050-b6a32e793e68\" (UID: \"9653fd1a-2f70-4b24-9050-b6a32e793e68\") " Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.597174 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9653fd1a-2f70-4b24-9050-b6a32e793e68-public-tls-certs\") pod \"9653fd1a-2f70-4b24-9050-b6a32e793e68\" (UID: \"9653fd1a-2f70-4b24-9050-b6a32e793e68\") " Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.597366 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9653fd1a-2f70-4b24-9050-b6a32e793e68-logs" (OuterVolumeSpecName: "logs") pod "9653fd1a-2f70-4b24-9050-b6a32e793e68" (UID: "9653fd1a-2f70-4b24-9050-b6a32e793e68"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.597256 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9653fd1a-2f70-4b24-9050-b6a32e793e68-config-data\") pod \"9653fd1a-2f70-4b24-9050-b6a32e793e68\" (UID: \"9653fd1a-2f70-4b24-9050-b6a32e793e68\") " Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.597985 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nb4db\" (UniqueName: \"kubernetes.io/projected/9653fd1a-2f70-4b24-9050-b6a32e793e68-kube-api-access-nb4db\") pod \"9653fd1a-2f70-4b24-9050-b6a32e793e68\" (UID: \"9653fd1a-2f70-4b24-9050-b6a32e793e68\") " Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.598021 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9653fd1a-2f70-4b24-9050-b6a32e793e68-combined-ca-bundle\") pod \"9653fd1a-2f70-4b24-9050-b6a32e793e68\" (UID: \"9653fd1a-2f70-4b24-9050-b6a32e793e68\") " Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.598487 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9653fd1a-2f70-4b24-9050-b6a32e793e68-logs\") on node \"crc\" DevicePath \"\"" Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.602283 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9653fd1a-2f70-4b24-9050-b6a32e793e68-kube-api-access-nb4db" (OuterVolumeSpecName: "kube-api-access-nb4db") pod "9653fd1a-2f70-4b24-9050-b6a32e793e68" (UID: "9653fd1a-2f70-4b24-9050-b6a32e793e68"). InnerVolumeSpecName "kube-api-access-nb4db". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.624142 4735 generic.go:334] "Generic (PLEG): container finished" podID="9653fd1a-2f70-4b24-9050-b6a32e793e68" containerID="6a9dae1b8fd8f547b9babef21dd58fa6882a5b2aa22524ad6de994b1d0b943a0" exitCode=0 Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.624178 4735 generic.go:334] "Generic (PLEG): container finished" podID="9653fd1a-2f70-4b24-9050-b6a32e793e68" containerID="bd792cfc1dcb4298fc4d2fc3898b3edc30958e3b4d3c0511c5f40308bad2ec48" exitCode=143 Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.624227 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9653fd1a-2f70-4b24-9050-b6a32e793e68","Type":"ContainerDied","Data":"6a9dae1b8fd8f547b9babef21dd58fa6882a5b2aa22524ad6de994b1d0b943a0"} Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.624253 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9653fd1a-2f70-4b24-9050-b6a32e793e68","Type":"ContainerDied","Data":"bd792cfc1dcb4298fc4d2fc3898b3edc30958e3b4d3c0511c5f40308bad2ec48"} Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.624262 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9653fd1a-2f70-4b24-9050-b6a32e793e68","Type":"ContainerDied","Data":"6dcfff02e64225369215252daef389b8e334d6892cf062db3b8be71232bf1167"} Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.624277 4735 scope.go:117] "RemoveContainer" containerID="6a9dae1b8fd8f547b9babef21dd58fa6882a5b2aa22524ad6de994b1d0b943a0" Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.624424 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.628696 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9653fd1a-2f70-4b24-9050-b6a32e793e68-config-data" (OuterVolumeSpecName: "config-data") pod "9653fd1a-2f70-4b24-9050-b6a32e793e68" (UID: "9653fd1a-2f70-4b24-9050-b6a32e793e68"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.632901 4735 generic.go:334] "Generic (PLEG): container finished" podID="909500e0-2864-46b8-9b4f-a234e80419f7" containerID="9e8ffc005f7aff3ca002a6906c52fa45daab77abd816f8b76fc204f8b960fa9a" exitCode=0 Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.632979 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"909500e0-2864-46b8-9b4f-a234e80419f7","Type":"ContainerDied","Data":"9e8ffc005f7aff3ca002a6906c52fa45daab77abd816f8b76fc204f8b960fa9a"} Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.634913 4735 generic.go:334] "Generic (PLEG): container finished" podID="54e82a2b-aaf1-42bc-b424-d570d07b6830" containerID="0df881b592fbfad805f3340095e25c2a9ce1db796e45ad65f7abb21e5a0af3cc" exitCode=143 Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.634943 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"54e82a2b-aaf1-42bc-b424-d570d07b6830","Type":"ContainerDied","Data":"0df881b592fbfad805f3340095e25c2a9ce1db796e45ad65f7abb21e5a0af3cc"} Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.645236 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9653fd1a-2f70-4b24-9050-b6a32e793e68-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9653fd1a-2f70-4b24-9050-b6a32e793e68" (UID: "9653fd1a-2f70-4b24-9050-b6a32e793e68"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.651888 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9653fd1a-2f70-4b24-9050-b6a32e793e68-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9653fd1a-2f70-4b24-9050-b6a32e793e68" (UID: "9653fd1a-2f70-4b24-9050-b6a32e793e68"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.665259 4735 scope.go:117] "RemoveContainer" containerID="bd792cfc1dcb4298fc4d2fc3898b3edc30958e3b4d3c0511c5f40308bad2ec48" Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.695068 4735 scope.go:117] "RemoveContainer" containerID="6a9dae1b8fd8f547b9babef21dd58fa6882a5b2aa22524ad6de994b1d0b943a0" Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.697914 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9653fd1a-2f70-4b24-9050-b6a32e793e68-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9653fd1a-2f70-4b24-9050-b6a32e793e68" (UID: "9653fd1a-2f70-4b24-9050-b6a32e793e68"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:36:32 crc kubenswrapper[4735]: E1001 10:36:32.698065 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a9dae1b8fd8f547b9babef21dd58fa6882a5b2aa22524ad6de994b1d0b943a0\": container with ID starting with 6a9dae1b8fd8f547b9babef21dd58fa6882a5b2aa22524ad6de994b1d0b943a0 not found: ID does not exist" containerID="6a9dae1b8fd8f547b9babef21dd58fa6882a5b2aa22524ad6de994b1d0b943a0" Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.698106 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a9dae1b8fd8f547b9babef21dd58fa6882a5b2aa22524ad6de994b1d0b943a0"} err="failed to get container status \"6a9dae1b8fd8f547b9babef21dd58fa6882a5b2aa22524ad6de994b1d0b943a0\": rpc error: code = NotFound desc = could not find container \"6a9dae1b8fd8f547b9babef21dd58fa6882a5b2aa22524ad6de994b1d0b943a0\": container with ID starting with 6a9dae1b8fd8f547b9babef21dd58fa6882a5b2aa22524ad6de994b1d0b943a0 not found: ID does not exist" Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.698154 4735 scope.go:117] "RemoveContainer" containerID="bd792cfc1dcb4298fc4d2fc3898b3edc30958e3b4d3c0511c5f40308bad2ec48" Oct 01 10:36:32 crc kubenswrapper[4735]: E1001 10:36:32.698690 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd792cfc1dcb4298fc4d2fc3898b3edc30958e3b4d3c0511c5f40308bad2ec48\": container with ID starting with bd792cfc1dcb4298fc4d2fc3898b3edc30958e3b4d3c0511c5f40308bad2ec48 not found: ID does not exist" containerID="bd792cfc1dcb4298fc4d2fc3898b3edc30958e3b4d3c0511c5f40308bad2ec48" Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.698738 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd792cfc1dcb4298fc4d2fc3898b3edc30958e3b4d3c0511c5f40308bad2ec48"} err="failed 
to get container status \"bd792cfc1dcb4298fc4d2fc3898b3edc30958e3b4d3c0511c5f40308bad2ec48\": rpc error: code = NotFound desc = could not find container \"bd792cfc1dcb4298fc4d2fc3898b3edc30958e3b4d3c0511c5f40308bad2ec48\": container with ID starting with bd792cfc1dcb4298fc4d2fc3898b3edc30958e3b4d3c0511c5f40308bad2ec48 not found: ID does not exist" Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.698772 4735 scope.go:117] "RemoveContainer" containerID="6a9dae1b8fd8f547b9babef21dd58fa6882a5b2aa22524ad6de994b1d0b943a0" Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.699046 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a9dae1b8fd8f547b9babef21dd58fa6882a5b2aa22524ad6de994b1d0b943a0"} err="failed to get container status \"6a9dae1b8fd8f547b9babef21dd58fa6882a5b2aa22524ad6de994b1d0b943a0\": rpc error: code = NotFound desc = could not find container \"6a9dae1b8fd8f547b9babef21dd58fa6882a5b2aa22524ad6de994b1d0b943a0\": container with ID starting with 6a9dae1b8fd8f547b9babef21dd58fa6882a5b2aa22524ad6de994b1d0b943a0 not found: ID does not exist" Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.699153 4735 scope.go:117] "RemoveContainer" containerID="bd792cfc1dcb4298fc4d2fc3898b3edc30958e3b4d3c0511c5f40308bad2ec48" Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.699942 4735 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9653fd1a-2f70-4b24-9050-b6a32e793e68-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.700141 4735 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9653fd1a-2f70-4b24-9050-b6a32e793e68-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.700214 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9653fd1a-2f70-4b24-9050-b6a32e793e68-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.700312 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nb4db\" (UniqueName: \"kubernetes.io/projected/9653fd1a-2f70-4b24-9050-b6a32e793e68-kube-api-access-nb4db\") on node \"crc\" DevicePath \"\"" Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.700397 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9653fd1a-2f70-4b24-9050-b6a32e793e68-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.700092 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd792cfc1dcb4298fc4d2fc3898b3edc30958e3b4d3c0511c5f40308bad2ec48"} err="failed to get container status \"bd792cfc1dcb4298fc4d2fc3898b3edc30958e3b4d3c0511c5f40308bad2ec48\": rpc error: code = NotFound desc = could not find container \"bd792cfc1dcb4298fc4d2fc3898b3edc30958e3b4d3c0511c5f40308bad2ec48\": container with ID starting with bd792cfc1dcb4298fc4d2fc3898b3edc30958e3b4d3c0511c5f40308bad2ec48 not found: ID does not exist" Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.769823 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.902415 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m54mc\" (UniqueName: \"kubernetes.io/projected/909500e0-2864-46b8-9b4f-a234e80419f7-kube-api-access-m54mc\") pod \"909500e0-2864-46b8-9b4f-a234e80419f7\" (UID: \"909500e0-2864-46b8-9b4f-a234e80419f7\") " Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.902911 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/909500e0-2864-46b8-9b4f-a234e80419f7-combined-ca-bundle\") pod \"909500e0-2864-46b8-9b4f-a234e80419f7\" (UID: \"909500e0-2864-46b8-9b4f-a234e80419f7\") " Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.903282 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/909500e0-2864-46b8-9b4f-a234e80419f7-config-data\") pod \"909500e0-2864-46b8-9b4f-a234e80419f7\" (UID: \"909500e0-2864-46b8-9b4f-a234e80419f7\") " Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.906586 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/909500e0-2864-46b8-9b4f-a234e80419f7-kube-api-access-m54mc" (OuterVolumeSpecName: "kube-api-access-m54mc") pod "909500e0-2864-46b8-9b4f-a234e80419f7" (UID: "909500e0-2864-46b8-9b4f-a234e80419f7"). InnerVolumeSpecName "kube-api-access-m54mc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.940876 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/909500e0-2864-46b8-9b4f-a234e80419f7-config-data" (OuterVolumeSpecName: "config-data") pod "909500e0-2864-46b8-9b4f-a234e80419f7" (UID: "909500e0-2864-46b8-9b4f-a234e80419f7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:36:32 crc kubenswrapper[4735]: I1001 10:36:32.941755 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/909500e0-2864-46b8-9b4f-a234e80419f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "909500e0-2864-46b8-9b4f-a234e80419f7" (UID: "909500e0-2864-46b8-9b4f-a234e80419f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.006006 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/909500e0-2864-46b8-9b4f-a234e80419f7-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.006043 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m54mc\" (UniqueName: \"kubernetes.io/projected/909500e0-2864-46b8-9b4f-a234e80419f7-kube-api-access-m54mc\") on node \"crc\" DevicePath \"\"" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.006053 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/909500e0-2864-46b8-9b4f-a234e80419f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.058381 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.072888 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.097928 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 01 10:36:33 crc kubenswrapper[4735]: E1001 10:36:33.098621 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="909500e0-2864-46b8-9b4f-a234e80419f7" containerName="nova-scheduler-scheduler" Oct 01 10:36:33 crc 
kubenswrapper[4735]: I1001 10:36:33.098665 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="909500e0-2864-46b8-9b4f-a234e80419f7" containerName="nova-scheduler-scheduler" Oct 01 10:36:33 crc kubenswrapper[4735]: E1001 10:36:33.098696 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b35d0ee3-23af-4661-88bc-df962b75ced3" containerName="nova-manage" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.098710 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b35d0ee3-23af-4661-88bc-df962b75ced3" containerName="nova-manage" Oct 01 10:36:33 crc kubenswrapper[4735]: E1001 10:36:33.098732 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9653fd1a-2f70-4b24-9050-b6a32e793e68" containerName="nova-api-log" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.098745 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9653fd1a-2f70-4b24-9050-b6a32e793e68" containerName="nova-api-log" Oct 01 10:36:33 crc kubenswrapper[4735]: E1001 10:36:33.098768 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9653fd1a-2f70-4b24-9050-b6a32e793e68" containerName="nova-api-api" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.098781 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9653fd1a-2f70-4b24-9050-b6a32e793e68" containerName="nova-api-api" Oct 01 10:36:33 crc kubenswrapper[4735]: E1001 10:36:33.098841 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ae06196-4040-40db-9dd1-2f4a7c1f616c" containerName="dnsmasq-dns" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.098853 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ae06196-4040-40db-9dd1-2f4a7c1f616c" containerName="dnsmasq-dns" Oct 01 10:36:33 crc kubenswrapper[4735]: E1001 10:36:33.098872 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ae06196-4040-40db-9dd1-2f4a7c1f616c" containerName="init" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.098883 
4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ae06196-4040-40db-9dd1-2f4a7c1f616c" containerName="init" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.099188 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9653fd1a-2f70-4b24-9050-b6a32e793e68" containerName="nova-api-api" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.099223 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ae06196-4040-40db-9dd1-2f4a7c1f616c" containerName="dnsmasq-dns" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.099240 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="909500e0-2864-46b8-9b4f-a234e80419f7" containerName="nova-scheduler-scheduler" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.099263 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9653fd1a-2f70-4b24-9050-b6a32e793e68" containerName="nova-api-log" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.099298 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b35d0ee3-23af-4661-88bc-df962b75ced3" containerName="nova-manage" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.101018 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.104506 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.104572 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.104756 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.110206 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.209825 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62b96f55-0807-4922-acaf-a84037e549ff-logs\") pod \"nova-api-0\" (UID: \"62b96f55-0807-4922-acaf-a84037e549ff\") " pod="openstack/nova-api-0" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.209990 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62b96f55-0807-4922-acaf-a84037e549ff-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"62b96f55-0807-4922-acaf-a84037e549ff\") " pod="openstack/nova-api-0" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.210079 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62b96f55-0807-4922-acaf-a84037e549ff-internal-tls-certs\") pod \"nova-api-0\" (UID: \"62b96f55-0807-4922-acaf-a84037e549ff\") " pod="openstack/nova-api-0" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.210138 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrwn4\" 
(UniqueName: \"kubernetes.io/projected/62b96f55-0807-4922-acaf-a84037e549ff-kube-api-access-rrwn4\") pod \"nova-api-0\" (UID: \"62b96f55-0807-4922-acaf-a84037e549ff\") " pod="openstack/nova-api-0" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.210168 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62b96f55-0807-4922-acaf-a84037e549ff-config-data\") pod \"nova-api-0\" (UID: \"62b96f55-0807-4922-acaf-a84037e549ff\") " pod="openstack/nova-api-0" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.210214 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62b96f55-0807-4922-acaf-a84037e549ff-public-tls-certs\") pod \"nova-api-0\" (UID: \"62b96f55-0807-4922-acaf-a84037e549ff\") " pod="openstack/nova-api-0" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.311358 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62b96f55-0807-4922-acaf-a84037e549ff-logs\") pod \"nova-api-0\" (UID: \"62b96f55-0807-4922-acaf-a84037e549ff\") " pod="openstack/nova-api-0" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.311427 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62b96f55-0807-4922-acaf-a84037e549ff-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"62b96f55-0807-4922-acaf-a84037e549ff\") " pod="openstack/nova-api-0" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.311451 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62b96f55-0807-4922-acaf-a84037e549ff-internal-tls-certs\") pod \"nova-api-0\" (UID: \"62b96f55-0807-4922-acaf-a84037e549ff\") " pod="openstack/nova-api-0" Oct 01 10:36:33 
crc kubenswrapper[4735]: I1001 10:36:33.311483 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrwn4\" (UniqueName: \"kubernetes.io/projected/62b96f55-0807-4922-acaf-a84037e549ff-kube-api-access-rrwn4\") pod \"nova-api-0\" (UID: \"62b96f55-0807-4922-acaf-a84037e549ff\") " pod="openstack/nova-api-0" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.311521 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62b96f55-0807-4922-acaf-a84037e549ff-config-data\") pod \"nova-api-0\" (UID: \"62b96f55-0807-4922-acaf-a84037e549ff\") " pod="openstack/nova-api-0" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.311553 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62b96f55-0807-4922-acaf-a84037e549ff-public-tls-certs\") pod \"nova-api-0\" (UID: \"62b96f55-0807-4922-acaf-a84037e549ff\") " pod="openstack/nova-api-0" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.312034 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62b96f55-0807-4922-acaf-a84037e549ff-logs\") pod \"nova-api-0\" (UID: \"62b96f55-0807-4922-acaf-a84037e549ff\") " pod="openstack/nova-api-0" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.316675 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62b96f55-0807-4922-acaf-a84037e549ff-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"62b96f55-0807-4922-acaf-a84037e549ff\") " pod="openstack/nova-api-0" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.319734 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62b96f55-0807-4922-acaf-a84037e549ff-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"62b96f55-0807-4922-acaf-a84037e549ff\") " pod="openstack/nova-api-0" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.327700 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62b96f55-0807-4922-acaf-a84037e549ff-config-data\") pod \"nova-api-0\" (UID: \"62b96f55-0807-4922-acaf-a84037e549ff\") " pod="openstack/nova-api-0" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.328213 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62b96f55-0807-4922-acaf-a84037e549ff-internal-tls-certs\") pod \"nova-api-0\" (UID: \"62b96f55-0807-4922-acaf-a84037e549ff\") " pod="openstack/nova-api-0" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.334481 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrwn4\" (UniqueName: \"kubernetes.io/projected/62b96f55-0807-4922-acaf-a84037e549ff-kube-api-access-rrwn4\") pod \"nova-api-0\" (UID: \"62b96f55-0807-4922-acaf-a84037e549ff\") " pod="openstack/nova-api-0" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.433724 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.652435 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"909500e0-2864-46b8-9b4f-a234e80419f7","Type":"ContainerDied","Data":"da42ad3161f5990a55504c00a1b68b579533b99c4fce05926188821a2ecee407"} Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.652562 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.652814 4735 scope.go:117] "RemoveContainer" containerID="9e8ffc005f7aff3ca002a6906c52fa45daab77abd816f8b76fc204f8b960fa9a" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.693018 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.718291 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.732555 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.734164 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.735879 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.742102 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.826767 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ce10bc0-35a7-49e7-b138-196478a093d0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6ce10bc0-35a7-49e7-b138-196478a093d0\") " pod="openstack/nova-scheduler-0" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.826838 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp8n6\" (UniqueName: \"kubernetes.io/projected/6ce10bc0-35a7-49e7-b138-196478a093d0-kube-api-access-xp8n6\") pod \"nova-scheduler-0\" (UID: \"6ce10bc0-35a7-49e7-b138-196478a093d0\") " pod="openstack/nova-scheduler-0" Oct 01 10:36:33 crc 
kubenswrapper[4735]: I1001 10:36:33.826874 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ce10bc0-35a7-49e7-b138-196478a093d0-config-data\") pod \"nova-scheduler-0\" (UID: \"6ce10bc0-35a7-49e7-b138-196478a093d0\") " pod="openstack/nova-scheduler-0" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.908010 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="909500e0-2864-46b8-9b4f-a234e80419f7" path="/var/lib/kubelet/pods/909500e0-2864-46b8-9b4f-a234e80419f7/volumes" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.908613 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9653fd1a-2f70-4b24-9050-b6a32e793e68" path="/var/lib/kubelet/pods/9653fd1a-2f70-4b24-9050-b6a32e793e68/volumes" Oct 01 10:36:33 crc kubenswrapper[4735]: W1001 10:36:33.909645 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62b96f55_0807_4922_acaf_a84037e549ff.slice/crio-139e25fc8520b9cb2c5f7af47a83049ab9d95a8d8d591b943ca820e89ee33a3c WatchSource:0}: Error finding container 139e25fc8520b9cb2c5f7af47a83049ab9d95a8d8d591b943ca820e89ee33a3c: Status 404 returned error can't find the container with id 139e25fc8520b9cb2c5f7af47a83049ab9d95a8d8d591b943ca820e89ee33a3c Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.909741 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.928523 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ce10bc0-35a7-49e7-b138-196478a093d0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6ce10bc0-35a7-49e7-b138-196478a093d0\") " pod="openstack/nova-scheduler-0" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.928614 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp8n6\" (UniqueName: \"kubernetes.io/projected/6ce10bc0-35a7-49e7-b138-196478a093d0-kube-api-access-xp8n6\") pod \"nova-scheduler-0\" (UID: \"6ce10bc0-35a7-49e7-b138-196478a093d0\") " pod="openstack/nova-scheduler-0" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.928653 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ce10bc0-35a7-49e7-b138-196478a093d0-config-data\") pod \"nova-scheduler-0\" (UID: \"6ce10bc0-35a7-49e7-b138-196478a093d0\") " pod="openstack/nova-scheduler-0" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.937234 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ce10bc0-35a7-49e7-b138-196478a093d0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6ce10bc0-35a7-49e7-b138-196478a093d0\") " pod="openstack/nova-scheduler-0" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.937849 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ce10bc0-35a7-49e7-b138-196478a093d0-config-data\") pod \"nova-scheduler-0\" (UID: \"6ce10bc0-35a7-49e7-b138-196478a093d0\") " pod="openstack/nova-scheduler-0" Oct 01 10:36:33 crc kubenswrapper[4735]: I1001 10:36:33.952886 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp8n6\" (UniqueName: \"kubernetes.io/projected/6ce10bc0-35a7-49e7-b138-196478a093d0-kube-api-access-xp8n6\") pod \"nova-scheduler-0\" (UID: \"6ce10bc0-35a7-49e7-b138-196478a093d0\") " pod="openstack/nova-scheduler-0" Oct 01 10:36:34 crc kubenswrapper[4735]: I1001 10:36:34.054224 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 10:36:34 crc kubenswrapper[4735]: I1001 10:36:34.514601 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 10:36:34 crc kubenswrapper[4735]: W1001 10:36:34.518567 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ce10bc0_35a7_49e7_b138_196478a093d0.slice/crio-bfed276ca94313a52218fda9dbc3f2084e4737e2f9808c8d52cffaa5ad91236a WatchSource:0}: Error finding container bfed276ca94313a52218fda9dbc3f2084e4737e2f9808c8d52cffaa5ad91236a: Status 404 returned error can't find the container with id bfed276ca94313a52218fda9dbc3f2084e4737e2f9808c8d52cffaa5ad91236a Oct 01 10:36:34 crc kubenswrapper[4735]: I1001 10:36:34.663892 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6ce10bc0-35a7-49e7-b138-196478a093d0","Type":"ContainerStarted","Data":"bfed276ca94313a52218fda9dbc3f2084e4737e2f9808c8d52cffaa5ad91236a"} Oct 01 10:36:34 crc kubenswrapper[4735]: I1001 10:36:34.667474 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"62b96f55-0807-4922-acaf-a84037e549ff","Type":"ContainerStarted","Data":"ee79b31602ea2215527ffc15d6319b8e1accc8b84cebf2d223082b52b8df3ad3"} Oct 01 10:36:34 crc kubenswrapper[4735]: I1001 10:36:34.667568 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"62b96f55-0807-4922-acaf-a84037e549ff","Type":"ContainerStarted","Data":"e04a23c8cfa917db5eafdab070f8f9347f89a93ffb4a57aa3e7767abc127c445"} Oct 01 10:36:34 crc kubenswrapper[4735]: I1001 10:36:34.667590 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"62b96f55-0807-4922-acaf-a84037e549ff","Type":"ContainerStarted","Data":"139e25fc8520b9cb2c5f7af47a83049ab9d95a8d8d591b943ca820e89ee33a3c"} Oct 01 10:36:34 crc kubenswrapper[4735]: I1001 
10:36:34.700410 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.700389264 podStartE2EDuration="1.700389264s" podCreationTimestamp="2025-10-01 10:36:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:36:34.695276036 +0000 UTC m=+1153.388097308" watchObservedRunningTime="2025-10-01 10:36:34.700389264 +0000 UTC m=+1153.393210526" Oct 01 10:36:35 crc kubenswrapper[4735]: I1001 10:36:35.485687 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 10:36:35 crc kubenswrapper[4735]: I1001 10:36:35.486114 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 10:36:35 crc kubenswrapper[4735]: I1001 10:36:35.521050 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 10:36:35 crc kubenswrapper[4735]: I1001 10:36:35.662424 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgqsg\" (UniqueName: \"kubernetes.io/projected/54e82a2b-aaf1-42bc-b424-d570d07b6830-kube-api-access-fgqsg\") pod \"54e82a2b-aaf1-42bc-b424-d570d07b6830\" (UID: \"54e82a2b-aaf1-42bc-b424-d570d07b6830\") " Oct 01 10:36:35 crc kubenswrapper[4735]: I1001 10:36:35.662617 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54e82a2b-aaf1-42bc-b424-d570d07b6830-logs\") pod \"54e82a2b-aaf1-42bc-b424-d570d07b6830\" (UID: \"54e82a2b-aaf1-42bc-b424-d570d07b6830\") " Oct 01 10:36:35 crc kubenswrapper[4735]: I1001 10:36:35.662689 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/54e82a2b-aaf1-42bc-b424-d570d07b6830-nova-metadata-tls-certs\") pod \"54e82a2b-aaf1-42bc-b424-d570d07b6830\" (UID: \"54e82a2b-aaf1-42bc-b424-d570d07b6830\") " Oct 01 10:36:35 crc kubenswrapper[4735]: I1001 10:36:35.662763 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54e82a2b-aaf1-42bc-b424-d570d07b6830-config-data\") pod \"54e82a2b-aaf1-42bc-b424-d570d07b6830\" (UID: \"54e82a2b-aaf1-42bc-b424-d570d07b6830\") " Oct 01 10:36:35 crc kubenswrapper[4735]: I1001 10:36:35.662967 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e82a2b-aaf1-42bc-b424-d570d07b6830-combined-ca-bundle\") pod \"54e82a2b-aaf1-42bc-b424-d570d07b6830\" (UID: \"54e82a2b-aaf1-42bc-b424-d570d07b6830\") " Oct 01 10:36:35 crc kubenswrapper[4735]: I1001 10:36:35.663166 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/54e82a2b-aaf1-42bc-b424-d570d07b6830-logs" (OuterVolumeSpecName: "logs") pod "54e82a2b-aaf1-42bc-b424-d570d07b6830" (UID: "54e82a2b-aaf1-42bc-b424-d570d07b6830"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:36:35 crc kubenswrapper[4735]: I1001 10:36:35.663634 4735 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54e82a2b-aaf1-42bc-b424-d570d07b6830-logs\") on node \"crc\" DevicePath \"\"" Oct 01 10:36:35 crc kubenswrapper[4735]: I1001 10:36:35.668893 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54e82a2b-aaf1-42bc-b424-d570d07b6830-kube-api-access-fgqsg" (OuterVolumeSpecName: "kube-api-access-fgqsg") pod "54e82a2b-aaf1-42bc-b424-d570d07b6830" (UID: "54e82a2b-aaf1-42bc-b424-d570d07b6830"). InnerVolumeSpecName "kube-api-access-fgqsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:36:35 crc kubenswrapper[4735]: I1001 10:36:35.683771 4735 generic.go:334] "Generic (PLEG): container finished" podID="54e82a2b-aaf1-42bc-b424-d570d07b6830" containerID="1b40a5aeb36e78a57ef5af8fb02b5864f5ca81c3e130921ad1ad8722dcd7ab68" exitCode=0 Oct 01 10:36:35 crc kubenswrapper[4735]: I1001 10:36:35.683856 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 10:36:35 crc kubenswrapper[4735]: I1001 10:36:35.683846 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"54e82a2b-aaf1-42bc-b424-d570d07b6830","Type":"ContainerDied","Data":"1b40a5aeb36e78a57ef5af8fb02b5864f5ca81c3e130921ad1ad8722dcd7ab68"} Oct 01 10:36:35 crc kubenswrapper[4735]: I1001 10:36:35.683998 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"54e82a2b-aaf1-42bc-b424-d570d07b6830","Type":"ContainerDied","Data":"e0c56682caf5a30646b07ab6955616ec6d035e0a86de2a6660c5bbbc7b1addd6"} Oct 01 10:36:35 crc kubenswrapper[4735]: I1001 10:36:35.684026 4735 scope.go:117] "RemoveContainer" containerID="1b40a5aeb36e78a57ef5af8fb02b5864f5ca81c3e130921ad1ad8722dcd7ab68" Oct 01 10:36:35 crc kubenswrapper[4735]: I1001 10:36:35.686378 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6ce10bc0-35a7-49e7-b138-196478a093d0","Type":"ContainerStarted","Data":"52309ce9af06e9ad6d135f98e9200274ec0933981b3e0c9510fe6efc13e28337"} Oct 01 10:36:35 crc kubenswrapper[4735]: I1001 10:36:35.700252 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54e82a2b-aaf1-42bc-b424-d570d07b6830-config-data" (OuterVolumeSpecName: "config-data") pod "54e82a2b-aaf1-42bc-b424-d570d07b6830" (UID: "54e82a2b-aaf1-42bc-b424-d570d07b6830"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:36:35 crc kubenswrapper[4735]: I1001 10:36:35.711579 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54e82a2b-aaf1-42bc-b424-d570d07b6830-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54e82a2b-aaf1-42bc-b424-d570d07b6830" (UID: "54e82a2b-aaf1-42bc-b424-d570d07b6830"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:36:35 crc kubenswrapper[4735]: I1001 10:36:35.715904 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.715878734 podStartE2EDuration="2.715878734s" podCreationTimestamp="2025-10-01 10:36:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:36:35.704142389 +0000 UTC m=+1154.396963661" watchObservedRunningTime="2025-10-01 10:36:35.715878734 +0000 UTC m=+1154.408700016" Oct 01 10:36:35 crc kubenswrapper[4735]: I1001 10:36:35.729904 4735 scope.go:117] "RemoveContainer" containerID="0df881b592fbfad805f3340095e25c2a9ce1db796e45ad65f7abb21e5a0af3cc" Oct 01 10:36:35 crc kubenswrapper[4735]: I1001 10:36:35.738924 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54e82a2b-aaf1-42bc-b424-d570d07b6830-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "54e82a2b-aaf1-42bc-b424-d570d07b6830" (UID: "54e82a2b-aaf1-42bc-b424-d570d07b6830"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:36:35 crc kubenswrapper[4735]: I1001 10:36:35.765092 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54e82a2b-aaf1-42bc-b424-d570d07b6830-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 10:36:35 crc kubenswrapper[4735]: I1001 10:36:35.765125 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e82a2b-aaf1-42bc-b424-d570d07b6830-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:36:35 crc kubenswrapper[4735]: I1001 10:36:35.765136 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgqsg\" (UniqueName: \"kubernetes.io/projected/54e82a2b-aaf1-42bc-b424-d570d07b6830-kube-api-access-fgqsg\") on node \"crc\" DevicePath \"\"" Oct 01 10:36:35 crc kubenswrapper[4735]: I1001 10:36:35.765145 4735 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/54e82a2b-aaf1-42bc-b424-d570d07b6830-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 10:36:35 crc kubenswrapper[4735]: I1001 10:36:35.767267 4735 scope.go:117] "RemoveContainer" containerID="1b40a5aeb36e78a57ef5af8fb02b5864f5ca81c3e130921ad1ad8722dcd7ab68" Oct 01 10:36:35 crc kubenswrapper[4735]: E1001 10:36:35.767887 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b40a5aeb36e78a57ef5af8fb02b5864f5ca81c3e130921ad1ad8722dcd7ab68\": container with ID starting with 1b40a5aeb36e78a57ef5af8fb02b5864f5ca81c3e130921ad1ad8722dcd7ab68 not found: ID does not exist" containerID="1b40a5aeb36e78a57ef5af8fb02b5864f5ca81c3e130921ad1ad8722dcd7ab68" Oct 01 10:36:35 crc kubenswrapper[4735]: I1001 10:36:35.767924 4735 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1b40a5aeb36e78a57ef5af8fb02b5864f5ca81c3e130921ad1ad8722dcd7ab68"} err="failed to get container status \"1b40a5aeb36e78a57ef5af8fb02b5864f5ca81c3e130921ad1ad8722dcd7ab68\": rpc error: code = NotFound desc = could not find container \"1b40a5aeb36e78a57ef5af8fb02b5864f5ca81c3e130921ad1ad8722dcd7ab68\": container with ID starting with 1b40a5aeb36e78a57ef5af8fb02b5864f5ca81c3e130921ad1ad8722dcd7ab68 not found: ID does not exist" Oct 01 10:36:35 crc kubenswrapper[4735]: I1001 10:36:35.767950 4735 scope.go:117] "RemoveContainer" containerID="0df881b592fbfad805f3340095e25c2a9ce1db796e45ad65f7abb21e5a0af3cc" Oct 01 10:36:35 crc kubenswrapper[4735]: E1001 10:36:35.768724 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0df881b592fbfad805f3340095e25c2a9ce1db796e45ad65f7abb21e5a0af3cc\": container with ID starting with 0df881b592fbfad805f3340095e25c2a9ce1db796e45ad65f7abb21e5a0af3cc not found: ID does not exist" containerID="0df881b592fbfad805f3340095e25c2a9ce1db796e45ad65f7abb21e5a0af3cc" Oct 01 10:36:35 crc kubenswrapper[4735]: I1001 10:36:35.768806 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0df881b592fbfad805f3340095e25c2a9ce1db796e45ad65f7abb21e5a0af3cc"} err="failed to get container status \"0df881b592fbfad805f3340095e25c2a9ce1db796e45ad65f7abb21e5a0af3cc\": rpc error: code = NotFound desc = could not find container \"0df881b592fbfad805f3340095e25c2a9ce1db796e45ad65f7abb21e5a0af3cc\": container with ID starting with 0df881b592fbfad805f3340095e25c2a9ce1db796e45ad65f7abb21e5a0af3cc not found: ID does not exist" Oct 01 10:36:36 crc kubenswrapper[4735]: I1001 10:36:36.030904 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 10:36:36 crc kubenswrapper[4735]: I1001 10:36:36.043420 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-metadata-0"] Oct 01 10:36:36 crc kubenswrapper[4735]: I1001 10:36:36.055722 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 01 10:36:36 crc kubenswrapper[4735]: E1001 10:36:36.056174 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54e82a2b-aaf1-42bc-b424-d570d07b6830" containerName="nova-metadata-metadata" Oct 01 10:36:36 crc kubenswrapper[4735]: I1001 10:36:36.056191 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="54e82a2b-aaf1-42bc-b424-d570d07b6830" containerName="nova-metadata-metadata" Oct 01 10:36:36 crc kubenswrapper[4735]: E1001 10:36:36.056234 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54e82a2b-aaf1-42bc-b424-d570d07b6830" containerName="nova-metadata-log" Oct 01 10:36:36 crc kubenswrapper[4735]: I1001 10:36:36.056242 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="54e82a2b-aaf1-42bc-b424-d570d07b6830" containerName="nova-metadata-log" Oct 01 10:36:36 crc kubenswrapper[4735]: I1001 10:36:36.056405 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="54e82a2b-aaf1-42bc-b424-d570d07b6830" containerName="nova-metadata-log" Oct 01 10:36:36 crc kubenswrapper[4735]: I1001 10:36:36.056424 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="54e82a2b-aaf1-42bc-b424-d570d07b6830" containerName="nova-metadata-metadata" Oct 01 10:36:36 crc kubenswrapper[4735]: I1001 10:36:36.057386 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 10:36:36 crc kubenswrapper[4735]: I1001 10:36:36.059941 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 01 10:36:36 crc kubenswrapper[4735]: I1001 10:36:36.060392 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 01 10:36:36 crc kubenswrapper[4735]: I1001 10:36:36.069157 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 10:36:36 crc kubenswrapper[4735]: I1001 10:36:36.174818 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzswg\" (UniqueName: \"kubernetes.io/projected/79b8eab7-e3a4-4194-852d-1f1b91155a7d-kube-api-access-hzswg\") pod \"nova-metadata-0\" (UID: \"79b8eab7-e3a4-4194-852d-1f1b91155a7d\") " pod="openstack/nova-metadata-0" Oct 01 10:36:36 crc kubenswrapper[4735]: I1001 10:36:36.175171 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/79b8eab7-e3a4-4194-852d-1f1b91155a7d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"79b8eab7-e3a4-4194-852d-1f1b91155a7d\") " pod="openstack/nova-metadata-0" Oct 01 10:36:36 crc kubenswrapper[4735]: I1001 10:36:36.175235 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79b8eab7-e3a4-4194-852d-1f1b91155a7d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"79b8eab7-e3a4-4194-852d-1f1b91155a7d\") " pod="openstack/nova-metadata-0" Oct 01 10:36:36 crc kubenswrapper[4735]: I1001 10:36:36.175269 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79b8eab7-e3a4-4194-852d-1f1b91155a7d-logs\") pod 
\"nova-metadata-0\" (UID: \"79b8eab7-e3a4-4194-852d-1f1b91155a7d\") " pod="openstack/nova-metadata-0" Oct 01 10:36:36 crc kubenswrapper[4735]: I1001 10:36:36.175350 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79b8eab7-e3a4-4194-852d-1f1b91155a7d-config-data\") pod \"nova-metadata-0\" (UID: \"79b8eab7-e3a4-4194-852d-1f1b91155a7d\") " pod="openstack/nova-metadata-0" Oct 01 10:36:36 crc kubenswrapper[4735]: I1001 10:36:36.277469 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/79b8eab7-e3a4-4194-852d-1f1b91155a7d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"79b8eab7-e3a4-4194-852d-1f1b91155a7d\") " pod="openstack/nova-metadata-0" Oct 01 10:36:36 crc kubenswrapper[4735]: I1001 10:36:36.277525 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79b8eab7-e3a4-4194-852d-1f1b91155a7d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"79b8eab7-e3a4-4194-852d-1f1b91155a7d\") " pod="openstack/nova-metadata-0" Oct 01 10:36:36 crc kubenswrapper[4735]: I1001 10:36:36.277546 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79b8eab7-e3a4-4194-852d-1f1b91155a7d-logs\") pod \"nova-metadata-0\" (UID: \"79b8eab7-e3a4-4194-852d-1f1b91155a7d\") " pod="openstack/nova-metadata-0" Oct 01 10:36:36 crc kubenswrapper[4735]: I1001 10:36:36.277584 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79b8eab7-e3a4-4194-852d-1f1b91155a7d-config-data\") pod \"nova-metadata-0\" (UID: \"79b8eab7-e3a4-4194-852d-1f1b91155a7d\") " pod="openstack/nova-metadata-0" Oct 01 10:36:36 crc kubenswrapper[4735]: I1001 10:36:36.277676 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzswg\" (UniqueName: \"kubernetes.io/projected/79b8eab7-e3a4-4194-852d-1f1b91155a7d-kube-api-access-hzswg\") pod \"nova-metadata-0\" (UID: \"79b8eab7-e3a4-4194-852d-1f1b91155a7d\") " pod="openstack/nova-metadata-0" Oct 01 10:36:36 crc kubenswrapper[4735]: I1001 10:36:36.281075 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79b8eab7-e3a4-4194-852d-1f1b91155a7d-logs\") pod \"nova-metadata-0\" (UID: \"79b8eab7-e3a4-4194-852d-1f1b91155a7d\") " pod="openstack/nova-metadata-0" Oct 01 10:36:36 crc kubenswrapper[4735]: I1001 10:36:36.283369 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79b8eab7-e3a4-4194-852d-1f1b91155a7d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"79b8eab7-e3a4-4194-852d-1f1b91155a7d\") " pod="openstack/nova-metadata-0" Oct 01 10:36:36 crc kubenswrapper[4735]: I1001 10:36:36.287188 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/79b8eab7-e3a4-4194-852d-1f1b91155a7d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"79b8eab7-e3a4-4194-852d-1f1b91155a7d\") " pod="openstack/nova-metadata-0" Oct 01 10:36:36 crc kubenswrapper[4735]: I1001 10:36:36.290845 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79b8eab7-e3a4-4194-852d-1f1b91155a7d-config-data\") pod \"nova-metadata-0\" (UID: \"79b8eab7-e3a4-4194-852d-1f1b91155a7d\") " pod="openstack/nova-metadata-0" Oct 01 10:36:36 crc kubenswrapper[4735]: I1001 10:36:36.296972 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzswg\" (UniqueName: \"kubernetes.io/projected/79b8eab7-e3a4-4194-852d-1f1b91155a7d-kube-api-access-hzswg\") pod 
\"nova-metadata-0\" (UID: \"79b8eab7-e3a4-4194-852d-1f1b91155a7d\") " pod="openstack/nova-metadata-0" Oct 01 10:36:36 crc kubenswrapper[4735]: I1001 10:36:36.396074 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 10:36:36 crc kubenswrapper[4735]: I1001 10:36:36.852999 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 10:36:36 crc kubenswrapper[4735]: W1001 10:36:36.860889 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79b8eab7_e3a4_4194_852d_1f1b91155a7d.slice/crio-007a22de83ac57f466dc660ad85a6ab791c6ff62f21dfa1c4f1a034150335db8 WatchSource:0}: Error finding container 007a22de83ac57f466dc660ad85a6ab791c6ff62f21dfa1c4f1a034150335db8: Status 404 returned error can't find the container with id 007a22de83ac57f466dc660ad85a6ab791c6ff62f21dfa1c4f1a034150335db8 Oct 01 10:36:37 crc kubenswrapper[4735]: I1001 10:36:37.705956 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79b8eab7-e3a4-4194-852d-1f1b91155a7d","Type":"ContainerStarted","Data":"93c1a016690e557e28c443662eb0b46fe85942a69337bbeb5fd0fd35eaa0479d"} Oct 01 10:36:37 crc kubenswrapper[4735]: I1001 10:36:37.706327 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79b8eab7-e3a4-4194-852d-1f1b91155a7d","Type":"ContainerStarted","Data":"ccd1ed4b9833a3a480996f67f65047f0c386289b97250afa44dadd815e3890a1"} Oct 01 10:36:37 crc kubenswrapper[4735]: I1001 10:36:37.706344 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"79b8eab7-e3a4-4194-852d-1f1b91155a7d","Type":"ContainerStarted","Data":"007a22de83ac57f466dc660ad85a6ab791c6ff62f21dfa1c4f1a034150335db8"} Oct 01 10:36:37 crc kubenswrapper[4735]: I1001 10:36:37.731714 4735 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.7316869879999999 podStartE2EDuration="1.731686988s" podCreationTimestamp="2025-10-01 10:36:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:36:37.726696774 +0000 UTC m=+1156.419518046" watchObservedRunningTime="2025-10-01 10:36:37.731686988 +0000 UTC m=+1156.424508260" Oct 01 10:36:37 crc kubenswrapper[4735]: I1001 10:36:37.910940 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54e82a2b-aaf1-42bc-b424-d570d07b6830" path="/var/lib/kubelet/pods/54e82a2b-aaf1-42bc-b424-d570d07b6830/volumes" Oct 01 10:36:39 crc kubenswrapper[4735]: I1001 10:36:39.054563 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 01 10:36:41 crc kubenswrapper[4735]: I1001 10:36:41.396425 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 10:36:41 crc kubenswrapper[4735]: I1001 10:36:41.396902 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 10:36:43 crc kubenswrapper[4735]: I1001 10:36:43.434694 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 10:36:43 crc kubenswrapper[4735]: I1001 10:36:43.435242 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 10:36:44 crc kubenswrapper[4735]: I1001 10:36:44.056871 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 01 10:36:44 crc kubenswrapper[4735]: I1001 10:36:44.108291 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 01 10:36:44 crc kubenswrapper[4735]: I1001 10:36:44.450644 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="62b96f55-0807-4922-acaf-a84037e549ff" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 10:36:44 crc kubenswrapper[4735]: I1001 10:36:44.450698 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="62b96f55-0807-4922-acaf-a84037e549ff" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 10:36:44 crc kubenswrapper[4735]: I1001 10:36:44.838859 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 01 10:36:46 crc kubenswrapper[4735]: I1001 10:36:46.397614 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 01 10:36:46 crc kubenswrapper[4735]: I1001 10:36:46.397687 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 01 10:36:47 crc kubenswrapper[4735]: I1001 10:36:47.413741 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="79b8eab7-e3a4-4194-852d-1f1b91155a7d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 10:36:47 crc kubenswrapper[4735]: I1001 10:36:47.416558 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="79b8eab7-e3a4-4194-852d-1f1b91155a7d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 10:36:53 crc kubenswrapper[4735]: I1001 10:36:53.447120 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-api-0" Oct 01 10:36:53 crc kubenswrapper[4735]: I1001 10:36:53.448629 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 01 10:36:53 crc kubenswrapper[4735]: I1001 10:36:53.449223 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 01 10:36:53 crc kubenswrapper[4735]: I1001 10:36:53.449283 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 01 10:36:53 crc kubenswrapper[4735]: I1001 10:36:53.456889 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 01 10:36:53 crc kubenswrapper[4735]: I1001 10:36:53.457153 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 01 10:36:54 crc kubenswrapper[4735]: I1001 10:36:54.948631 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 01 10:36:56 crc kubenswrapper[4735]: I1001 10:36:56.404555 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 01 10:36:56 crc kubenswrapper[4735]: I1001 10:36:56.405482 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 01 10:36:56 crc kubenswrapper[4735]: I1001 10:36:56.424875 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 01 10:36:56 crc kubenswrapper[4735]: I1001 10:36:56.919633 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 01 10:37:04 crc kubenswrapper[4735]: I1001 10:37:04.814872 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 10:37:05 crc kubenswrapper[4735]: I1001 10:37:05.485281 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 10:37:05 crc kubenswrapper[4735]: I1001 10:37:05.485349 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 10:37:05 crc kubenswrapper[4735]: I1001 10:37:05.685314 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 10:37:09 crc kubenswrapper[4735]: I1001 10:37:09.610728 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="f74d6671-f7b0-46ae-91d4-ddb09a530249" containerName="rabbitmq" containerID="cri-o://6948c7ada5c15da2f6d84ce37854bb9e6c999f4068d03ee0404a038e49962128" gracePeriod=604796 Oct 01 10:37:09 crc kubenswrapper[4735]: I1001 10:37:09.658863 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="0be1b363-c0e5-4c73-9359-00032a6c8ab9" containerName="rabbitmq" containerID="cri-o://2057b4715aa35c68a82b85bf9b189bc9f2536a5e6a4a8fb0ef7f114940fa42e5" gracePeriod=604797 Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.098222 4735 generic.go:334] "Generic (PLEG): container finished" podID="f74d6671-f7b0-46ae-91d4-ddb09a530249" containerID="6948c7ada5c15da2f6d84ce37854bb9e6c999f4068d03ee0404a038e49962128" exitCode=0 Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.098407 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"f74d6671-f7b0-46ae-91d4-ddb09a530249","Type":"ContainerDied","Data":"6948c7ada5c15da2f6d84ce37854bb9e6c999f4068d03ee0404a038e49962128"} Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.101387 4735 generic.go:334] "Generic (PLEG): container finished" podID="0be1b363-c0e5-4c73-9359-00032a6c8ab9" containerID="2057b4715aa35c68a82b85bf9b189bc9f2536a5e6a4a8fb0ef7f114940fa42e5" exitCode=0 Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.101423 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0be1b363-c0e5-4c73-9359-00032a6c8ab9","Type":"ContainerDied","Data":"2057b4715aa35c68a82b85bf9b189bc9f2536a5e6a4a8fb0ef7f114940fa42e5"} Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.421445 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.427883 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.472085 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f74d6671-f7b0-46ae-91d4-ddb09a530249-server-conf\") pod \"f74d6671-f7b0-46ae-91d4-ddb09a530249\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.472126 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0be1b363-c0e5-4c73-9359-00032a6c8ab9-rabbitmq-confd\") pod \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.472155 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f74d6671-f7b0-46ae-91d4-ddb09a530249-rabbitmq-erlang-cookie\") pod \"f74d6671-f7b0-46ae-91d4-ddb09a530249\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.472202 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgs2x\" (UniqueName: \"kubernetes.io/projected/f74d6671-f7b0-46ae-91d4-ddb09a530249-kube-api-access-pgs2x\") pod \"f74d6671-f7b0-46ae-91d4-ddb09a530249\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.472232 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f74d6671-f7b0-46ae-91d4-ddb09a530249-pod-info\") pod \"f74d6671-f7b0-46ae-91d4-ddb09a530249\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.472250 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0be1b363-c0e5-4c73-9359-00032a6c8ab9-pod-info\") pod \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.472279 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0be1b363-c0e5-4c73-9359-00032a6c8ab9-rabbitmq-plugins\") pod \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.472306 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f74d6671-f7b0-46ae-91d4-ddb09a530249-erlang-cookie-secret\") pod \"f74d6671-f7b0-46ae-91d4-ddb09a530249\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.472350 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f74d6671-f7b0-46ae-91d4-ddb09a530249-rabbitmq-plugins\") pod \"f74d6671-f7b0-46ae-91d4-ddb09a530249\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.472377 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"f74d6671-f7b0-46ae-91d4-ddb09a530249\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.472406 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0be1b363-c0e5-4c73-9359-00032a6c8ab9-plugins-conf\") pod \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 
10:37:16.472430 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f74d6671-f7b0-46ae-91d4-ddb09a530249-plugins-conf\") pod \"f74d6671-f7b0-46ae-91d4-ddb09a530249\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.472446 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0be1b363-c0e5-4c73-9359-00032a6c8ab9-rabbitmq-tls\") pod \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.472468 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0be1b363-c0e5-4c73-9359-00032a6c8ab9-erlang-cookie-secret\") pod \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.472513 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0be1b363-c0e5-4c73-9359-00032a6c8ab9-config-data\") pod \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.472537 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0be1b363-c0e5-4c73-9359-00032a6c8ab9-server-conf\") pod \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.472560 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlgjh\" (UniqueName: \"kubernetes.io/projected/0be1b363-c0e5-4c73-9359-00032a6c8ab9-kube-api-access-tlgjh\") pod 
\"0be1b363-c0e5-4c73-9359-00032a6c8ab9\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.472589 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.472630 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f74d6671-f7b0-46ae-91d4-ddb09a530249-rabbitmq-tls\") pod \"f74d6671-f7b0-46ae-91d4-ddb09a530249\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.472645 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0be1b363-c0e5-4c73-9359-00032a6c8ab9-rabbitmq-erlang-cookie\") pod \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\" (UID: \"0be1b363-c0e5-4c73-9359-00032a6c8ab9\") " Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.472698 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f74d6671-f7b0-46ae-91d4-ddb09a530249-rabbitmq-confd\") pod \"f74d6671-f7b0-46ae-91d4-ddb09a530249\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.472730 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f74d6671-f7b0-46ae-91d4-ddb09a530249-config-data\") pod \"f74d6671-f7b0-46ae-91d4-ddb09a530249\" (UID: \"f74d6671-f7b0-46ae-91d4-ddb09a530249\") " Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.473511 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/f74d6671-f7b0-46ae-91d4-ddb09a530249-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "f74d6671-f7b0-46ae-91d4-ddb09a530249" (UID: "f74d6671-f7b0-46ae-91d4-ddb09a530249"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.473953 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f74d6671-f7b0-46ae-91d4-ddb09a530249-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "f74d6671-f7b0-46ae-91d4-ddb09a530249" (UID: "f74d6671-f7b0-46ae-91d4-ddb09a530249"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.475223 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0be1b363-c0e5-4c73-9359-00032a6c8ab9-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "0be1b363-c0e5-4c73-9359-00032a6c8ab9" (UID: "0be1b363-c0e5-4c73-9359-00032a6c8ab9"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.475446 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0be1b363-c0e5-4c73-9359-00032a6c8ab9-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "0be1b363-c0e5-4c73-9359-00032a6c8ab9" (UID: "0be1b363-c0e5-4c73-9359-00032a6c8ab9"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.475691 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0be1b363-c0e5-4c73-9359-00032a6c8ab9-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "0be1b363-c0e5-4c73-9359-00032a6c8ab9" (UID: "0be1b363-c0e5-4c73-9359-00032a6c8ab9"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.475862 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f74d6671-f7b0-46ae-91d4-ddb09a530249-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "f74d6671-f7b0-46ae-91d4-ddb09a530249" (UID: "f74d6671-f7b0-46ae-91d4-ddb09a530249"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.486752 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f74d6671-f7b0-46ae-91d4-ddb09a530249-kube-api-access-pgs2x" (OuterVolumeSpecName: "kube-api-access-pgs2x") pod "f74d6671-f7b0-46ae-91d4-ddb09a530249" (UID: "f74d6671-f7b0-46ae-91d4-ddb09a530249"). InnerVolumeSpecName "kube-api-access-pgs2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.497095 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/f74d6671-f7b0-46ae-91d4-ddb09a530249-pod-info" (OuterVolumeSpecName: "pod-info") pod "f74d6671-f7b0-46ae-91d4-ddb09a530249" (UID: "f74d6671-f7b0-46ae-91d4-ddb09a530249"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.497122 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0be1b363-c0e5-4c73-9359-00032a6c8ab9-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "0be1b363-c0e5-4c73-9359-00032a6c8ab9" (UID: "0be1b363-c0e5-4c73-9359-00032a6c8ab9"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.513683 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0be1b363-c0e5-4c73-9359-00032a6c8ab9-kube-api-access-tlgjh" (OuterVolumeSpecName: "kube-api-access-tlgjh") pod "0be1b363-c0e5-4c73-9359-00032a6c8ab9" (UID: "0be1b363-c0e5-4c73-9359-00032a6c8ab9"). InnerVolumeSpecName "kube-api-access-tlgjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.516836 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f74d6671-f7b0-46ae-91d4-ddb09a530249-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "f74d6671-f7b0-46ae-91d4-ddb09a530249" (UID: "f74d6671-f7b0-46ae-91d4-ddb09a530249"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.521253 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f74d6671-f7b0-46ae-91d4-ddb09a530249-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "f74d6671-f7b0-46ae-91d4-ddb09a530249" (UID: "f74d6671-f7b0-46ae-91d4-ddb09a530249"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.521371 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0be1b363-c0e5-4c73-9359-00032a6c8ab9-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "0be1b363-c0e5-4c73-9359-00032a6c8ab9" (UID: "0be1b363-c0e5-4c73-9359-00032a6c8ab9"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.521485 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/0be1b363-c0e5-4c73-9359-00032a6c8ab9-pod-info" (OuterVolumeSpecName: "pod-info") pod "0be1b363-c0e5-4c73-9359-00032a6c8ab9" (UID: "0be1b363-c0e5-4c73-9359-00032a6c8ab9"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.531622 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "f74d6671-f7b0-46ae-91d4-ddb09a530249" (UID: "f74d6671-f7b0-46ae-91d4-ddb09a530249"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.533300 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "0be1b363-c0e5-4c73-9359-00032a6c8ab9" (UID: "0be1b363-c0e5-4c73-9359-00032a6c8ab9"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.555149 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f74d6671-f7b0-46ae-91d4-ddb09a530249-config-data" (OuterVolumeSpecName: "config-data") pod "f74d6671-f7b0-46ae-91d4-ddb09a530249" (UID: "f74d6671-f7b0-46ae-91d4-ddb09a530249"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.562039 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0be1b363-c0e5-4c73-9359-00032a6c8ab9-config-data" (OuterVolumeSpecName: "config-data") pod "0be1b363-c0e5-4c73-9359-00032a6c8ab9" (UID: "0be1b363-c0e5-4c73-9359-00032a6c8ab9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.574415 4735 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0be1b363-c0e5-4c73-9359-00032a6c8ab9-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.574446 4735 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0be1b363-c0e5-4c73-9359-00032a6c8ab9-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.574456 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0be1b363-c0e5-4c73-9359-00032a6c8ab9-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.574468 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlgjh\" (UniqueName: \"kubernetes.io/projected/0be1b363-c0e5-4c73-9359-00032a6c8ab9-kube-api-access-tlgjh\") on node \"crc\" 
DevicePath \"\"" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.574542 4735 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.574553 4735 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f74d6671-f7b0-46ae-91d4-ddb09a530249-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.574562 4735 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0be1b363-c0e5-4c73-9359-00032a6c8ab9-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.574570 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f74d6671-f7b0-46ae-91d4-ddb09a530249-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.574581 4735 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f74d6671-f7b0-46ae-91d4-ddb09a530249-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.574590 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgs2x\" (UniqueName: \"kubernetes.io/projected/f74d6671-f7b0-46ae-91d4-ddb09a530249-kube-api-access-pgs2x\") on node \"crc\" DevicePath \"\"" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.574598 4735 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0be1b363-c0e5-4c73-9359-00032a6c8ab9-pod-info\") on node \"crc\" DevicePath \"\"" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.574606 4735 reconciler_common.go:293] 
"Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0be1b363-c0e5-4c73-9359-00032a6c8ab9-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.574614 4735 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f74d6671-f7b0-46ae-91d4-ddb09a530249-pod-info\") on node \"crc\" DevicePath \"\"" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.574622 4735 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f74d6671-f7b0-46ae-91d4-ddb09a530249-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.574629 4735 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f74d6671-f7b0-46ae-91d4-ddb09a530249-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.574643 4735 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.574651 4735 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0be1b363-c0e5-4c73-9359-00032a6c8ab9-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.574658 4735 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f74d6671-f7b0-46ae-91d4-ddb09a530249-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.590098 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0be1b363-c0e5-4c73-9359-00032a6c8ab9-server-conf" 
(OuterVolumeSpecName: "server-conf") pod "0be1b363-c0e5-4c73-9359-00032a6c8ab9" (UID: "0be1b363-c0e5-4c73-9359-00032a6c8ab9"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.595937 4735 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.599662 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f74d6671-f7b0-46ae-91d4-ddb09a530249-server-conf" (OuterVolumeSpecName: "server-conf") pod "f74d6671-f7b0-46ae-91d4-ddb09a530249" (UID: "f74d6671-f7b0-46ae-91d4-ddb09a530249"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.604126 4735 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.667833 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0be1b363-c0e5-4c73-9359-00032a6c8ab9-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "0be1b363-c0e5-4c73-9359-00032a6c8ab9" (UID: "0be1b363-c0e5-4c73-9359-00032a6c8ab9"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.670306 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f74d6671-f7b0-46ae-91d4-ddb09a530249-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "f74d6671-f7b0-46ae-91d4-ddb09a530249" (UID: "f74d6671-f7b0-46ae-91d4-ddb09a530249"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.686190 4735 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.686221 4735 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0be1b363-c0e5-4c73-9359-00032a6c8ab9-server-conf\") on node \"crc\" DevicePath \"\"" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.686232 4735 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.686240 4735 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f74d6671-f7b0-46ae-91d4-ddb09a530249-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.686249 4735 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f74d6671-f7b0-46ae-91d4-ddb09a530249-server-conf\") on node \"crc\" DevicePath \"\"" Oct 01 10:37:16 crc kubenswrapper[4735]: I1001 10:37:16.686257 4735 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0be1b363-c0e5-4c73-9359-00032a6c8ab9-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.112133 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.112177 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0be1b363-c0e5-4c73-9359-00032a6c8ab9","Type":"ContainerDied","Data":"7012fdbfac0f507d9765bd62ef994dcf74c327644ac8db74c521b0908d050531"} Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.113261 4735 scope.go:117] "RemoveContainer" containerID="2057b4715aa35c68a82b85bf9b189bc9f2536a5e6a4a8fb0ef7f114940fa42e5" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.118244 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f74d6671-f7b0-46ae-91d4-ddb09a530249","Type":"ContainerDied","Data":"83e3bf19b9306ed77e4db32383597ea8f99d5083fbf0b2ab20cbd087be3b5194"} Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.118331 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.172139 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.172643 4735 scope.go:117] "RemoveContainer" containerID="dd7a7fe9aeac4b441d9f64f6405e80c705fceb5363e7a65d51c92409606fc6d2" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.183638 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.202575 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.208082 4735 scope.go:117] "RemoveContainer" containerID="6948c7ada5c15da2f6d84ce37854bb9e6c999f4068d03ee0404a038e49962128" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.216621 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/rabbitmq-server-0"] Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.232641 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 10:37:17 crc kubenswrapper[4735]: E1001 10:37:17.233147 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0be1b363-c0e5-4c73-9359-00032a6c8ab9" containerName="setup-container" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.233169 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0be1b363-c0e5-4c73-9359-00032a6c8ab9" containerName="setup-container" Oct 01 10:37:17 crc kubenswrapper[4735]: E1001 10:37:17.233185 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f74d6671-f7b0-46ae-91d4-ddb09a530249" containerName="setup-container" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.233193 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f74d6671-f7b0-46ae-91d4-ddb09a530249" containerName="setup-container" Oct 01 10:37:17 crc kubenswrapper[4735]: E1001 10:37:17.233213 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0be1b363-c0e5-4c73-9359-00032a6c8ab9" containerName="rabbitmq" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.233223 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0be1b363-c0e5-4c73-9359-00032a6c8ab9" containerName="rabbitmq" Oct 01 10:37:17 crc kubenswrapper[4735]: E1001 10:37:17.233242 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f74d6671-f7b0-46ae-91d4-ddb09a530249" containerName="rabbitmq" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.233250 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f74d6671-f7b0-46ae-91d4-ddb09a530249" containerName="rabbitmq" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.233550 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="0be1b363-c0e5-4c73-9359-00032a6c8ab9" containerName="rabbitmq" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 
10:37:17.233587 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f74d6671-f7b0-46ae-91d4-ddb09a530249" containerName="rabbitmq" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.251357 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.252012 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.252526 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.252602 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.257402 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.258307 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.258358 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.258631 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.258643 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.258674 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.258744 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 01 
10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.258764 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.258964 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.259030 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.259077 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.259204 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-m6njq" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.259211 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.259285 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.259316 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-pl5ft" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.268046 4735 scope.go:117] "RemoveContainer" containerID="99e7412a96e156ecb4e75f45624255a2002f318654941bab406bfe4f4add0f39" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.308098 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cc51941b-ed03-480a-a90b-ba40dec75a6c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cc51941b-ed03-480a-a90b-ba40dec75a6c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.308348 
4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cc51941b-ed03-480a-a90b-ba40dec75a6c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"cc51941b-ed03-480a-a90b-ba40dec75a6c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.308372 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/48a5abfa-1c13-4130-8cad-1596c95ef581-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"48a5abfa-1c13-4130-8cad-1596c95ef581\") " pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.308394 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/48a5abfa-1c13-4130-8cad-1596c95ef581-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"48a5abfa-1c13-4130-8cad-1596c95ef581\") " pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.308414 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cc51941b-ed03-480a-a90b-ba40dec75a6c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"cc51941b-ed03-480a-a90b-ba40dec75a6c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.308430 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h52r7\" (UniqueName: \"kubernetes.io/projected/cc51941b-ed03-480a-a90b-ba40dec75a6c-kube-api-access-h52r7\") pod \"rabbitmq-cell1-server-0\" (UID: \"cc51941b-ed03-480a-a90b-ba40dec75a6c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.308446 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"48a5abfa-1c13-4130-8cad-1596c95ef581\") " pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.308471 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cc51941b-ed03-480a-a90b-ba40dec75a6c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"cc51941b-ed03-480a-a90b-ba40dec75a6c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.308524 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/48a5abfa-1c13-4130-8cad-1596c95ef581-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"48a5abfa-1c13-4130-8cad-1596c95ef581\") " pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.308542 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/48a5abfa-1c13-4130-8cad-1596c95ef581-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"48a5abfa-1c13-4130-8cad-1596c95ef581\") " pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.308560 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cc51941b-ed03-480a-a90b-ba40dec75a6c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"cc51941b-ed03-480a-a90b-ba40dec75a6c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.308587 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/48a5abfa-1c13-4130-8cad-1596c95ef581-pod-info\") pod \"rabbitmq-server-0\" (UID: \"48a5abfa-1c13-4130-8cad-1596c95ef581\") " pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.308608 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cc51941b-ed03-480a-a90b-ba40dec75a6c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"cc51941b-ed03-480a-a90b-ba40dec75a6c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.308627 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twkk6\" (UniqueName: \"kubernetes.io/projected/48a5abfa-1c13-4130-8cad-1596c95ef581-kube-api-access-twkk6\") pod \"rabbitmq-server-0\" (UID: \"48a5abfa-1c13-4130-8cad-1596c95ef581\") " pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.308650 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/48a5abfa-1c13-4130-8cad-1596c95ef581-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"48a5abfa-1c13-4130-8cad-1596c95ef581\") " pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.308666 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"cc51941b-ed03-480a-a90b-ba40dec75a6c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.308691 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cc51941b-ed03-480a-a90b-ba40dec75a6c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"cc51941b-ed03-480a-a90b-ba40dec75a6c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.308714 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/48a5abfa-1c13-4130-8cad-1596c95ef581-server-conf\") pod \"rabbitmq-server-0\" (UID: \"48a5abfa-1c13-4130-8cad-1596c95ef581\") " pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.308733 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48a5abfa-1c13-4130-8cad-1596c95ef581-config-data\") pod \"rabbitmq-server-0\" (UID: \"48a5abfa-1c13-4130-8cad-1596c95ef581\") " pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.308753 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/48a5abfa-1c13-4130-8cad-1596c95ef581-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"48a5abfa-1c13-4130-8cad-1596c95ef581\") " pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.308776 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cc51941b-ed03-480a-a90b-ba40dec75a6c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cc51941b-ed03-480a-a90b-ba40dec75a6c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.308791 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/cc51941b-ed03-480a-a90b-ba40dec75a6c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"cc51941b-ed03-480a-a90b-ba40dec75a6c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.410592 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/48a5abfa-1c13-4130-8cad-1596c95ef581-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"48a5abfa-1c13-4130-8cad-1596c95ef581\") " pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.410638 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/48a5abfa-1c13-4130-8cad-1596c95ef581-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"48a5abfa-1c13-4130-8cad-1596c95ef581\") " pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.410670 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cc51941b-ed03-480a-a90b-ba40dec75a6c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"cc51941b-ed03-480a-a90b-ba40dec75a6c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.410734 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/48a5abfa-1c13-4130-8cad-1596c95ef581-pod-info\") pod \"rabbitmq-server-0\" (UID: \"48a5abfa-1c13-4130-8cad-1596c95ef581\") " pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.411180 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/48a5abfa-1c13-4130-8cad-1596c95ef581-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: 
\"48a5abfa-1c13-4130-8cad-1596c95ef581\") " pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.411374 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cc51941b-ed03-480a-a90b-ba40dec75a6c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"cc51941b-ed03-480a-a90b-ba40dec75a6c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.411767 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twkk6\" (UniqueName: \"kubernetes.io/projected/48a5abfa-1c13-4130-8cad-1596c95ef581-kube-api-access-twkk6\") pod \"rabbitmq-server-0\" (UID: \"48a5abfa-1c13-4130-8cad-1596c95ef581\") " pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.411808 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/48a5abfa-1c13-4130-8cad-1596c95ef581-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"48a5abfa-1c13-4130-8cad-1596c95ef581\") " pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.411828 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"cc51941b-ed03-480a-a90b-ba40dec75a6c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.411906 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cc51941b-ed03-480a-a90b-ba40dec75a6c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"cc51941b-ed03-480a-a90b-ba40dec75a6c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 
10:37:17.411939 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/48a5abfa-1c13-4130-8cad-1596c95ef581-server-conf\") pod \"rabbitmq-server-0\" (UID: \"48a5abfa-1c13-4130-8cad-1596c95ef581\") " pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.411968 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48a5abfa-1c13-4130-8cad-1596c95ef581-config-data\") pod \"rabbitmq-server-0\" (UID: \"48a5abfa-1c13-4130-8cad-1596c95ef581\") " pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.411999 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/48a5abfa-1c13-4130-8cad-1596c95ef581-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"48a5abfa-1c13-4130-8cad-1596c95ef581\") " pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.412025 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cc51941b-ed03-480a-a90b-ba40dec75a6c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cc51941b-ed03-480a-a90b-ba40dec75a6c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.412041 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cc51941b-ed03-480a-a90b-ba40dec75a6c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"cc51941b-ed03-480a-a90b-ba40dec75a6c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.412070 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"cc51941b-ed03-480a-a90b-ba40dec75a6c\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.412088 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cc51941b-ed03-480a-a90b-ba40dec75a6c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cc51941b-ed03-480a-a90b-ba40dec75a6c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.412115 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cc51941b-ed03-480a-a90b-ba40dec75a6c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"cc51941b-ed03-480a-a90b-ba40dec75a6c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.412129 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/48a5abfa-1c13-4130-8cad-1596c95ef581-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"48a5abfa-1c13-4130-8cad-1596c95ef581\") " pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.412136 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/48a5abfa-1c13-4130-8cad-1596c95ef581-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"48a5abfa-1c13-4130-8cad-1596c95ef581\") " pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.412187 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/48a5abfa-1c13-4130-8cad-1596c95ef581-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"48a5abfa-1c13-4130-8cad-1596c95ef581\") " pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.412203 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cc51941b-ed03-480a-a90b-ba40dec75a6c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"cc51941b-ed03-480a-a90b-ba40dec75a6c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.412226 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h52r7\" (UniqueName: \"kubernetes.io/projected/cc51941b-ed03-480a-a90b-ba40dec75a6c-kube-api-access-h52r7\") pod \"rabbitmq-cell1-server-0\" (UID: \"cc51941b-ed03-480a-a90b-ba40dec75a6c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.412248 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"48a5abfa-1c13-4130-8cad-1596c95ef581\") " pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.412294 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cc51941b-ed03-480a-a90b-ba40dec75a6c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"cc51941b-ed03-480a-a90b-ba40dec75a6c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.412713 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cc51941b-ed03-480a-a90b-ba40dec75a6c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"cc51941b-ed03-480a-a90b-ba40dec75a6c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc 
kubenswrapper[4735]: I1001 10:37:17.412826 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cc51941b-ed03-480a-a90b-ba40dec75a6c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cc51941b-ed03-480a-a90b-ba40dec75a6c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.413167 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48a5abfa-1c13-4130-8cad-1596c95ef581-config-data\") pod \"rabbitmq-server-0\" (UID: \"48a5abfa-1c13-4130-8cad-1596c95ef581\") " pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.413348 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/48a5abfa-1c13-4130-8cad-1596c95ef581-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"48a5abfa-1c13-4130-8cad-1596c95ef581\") " pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.413809 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cc51941b-ed03-480a-a90b-ba40dec75a6c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"cc51941b-ed03-480a-a90b-ba40dec75a6c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.413897 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"48a5abfa-1c13-4130-8cad-1596c95ef581\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.414444 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/48a5abfa-1c13-4130-8cad-1596c95ef581-server-conf\") pod \"rabbitmq-server-0\" (UID: \"48a5abfa-1c13-4130-8cad-1596c95ef581\") " pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.415274 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/48a5abfa-1c13-4130-8cad-1596c95ef581-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"48a5abfa-1c13-4130-8cad-1596c95ef581\") " pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.415281 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cc51941b-ed03-480a-a90b-ba40dec75a6c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"cc51941b-ed03-480a-a90b-ba40dec75a6c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.415453 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cc51941b-ed03-480a-a90b-ba40dec75a6c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cc51941b-ed03-480a-a90b-ba40dec75a6c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.415736 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/48a5abfa-1c13-4130-8cad-1596c95ef581-pod-info\") pod \"rabbitmq-server-0\" (UID: \"48a5abfa-1c13-4130-8cad-1596c95ef581\") " pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.416587 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/48a5abfa-1c13-4130-8cad-1596c95ef581-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"48a5abfa-1c13-4130-8cad-1596c95ef581\") " pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 
crc kubenswrapper[4735]: I1001 10:37:17.416904 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/48a5abfa-1c13-4130-8cad-1596c95ef581-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"48a5abfa-1c13-4130-8cad-1596c95ef581\") " pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.417769 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cc51941b-ed03-480a-a90b-ba40dec75a6c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"cc51941b-ed03-480a-a90b-ba40dec75a6c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.418513 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cc51941b-ed03-480a-a90b-ba40dec75a6c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"cc51941b-ed03-480a-a90b-ba40dec75a6c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.419282 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cc51941b-ed03-480a-a90b-ba40dec75a6c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"cc51941b-ed03-480a-a90b-ba40dec75a6c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.420480 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cc51941b-ed03-480a-a90b-ba40dec75a6c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"cc51941b-ed03-480a-a90b-ba40dec75a6c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.430211 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twkk6\" 
(UniqueName: \"kubernetes.io/projected/48a5abfa-1c13-4130-8cad-1596c95ef581-kube-api-access-twkk6\") pod \"rabbitmq-server-0\" (UID: \"48a5abfa-1c13-4130-8cad-1596c95ef581\") " pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.431022 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h52r7\" (UniqueName: \"kubernetes.io/projected/cc51941b-ed03-480a-a90b-ba40dec75a6c-kube-api-access-h52r7\") pod \"rabbitmq-cell1-server-0\" (UID: \"cc51941b-ed03-480a-a90b-ba40dec75a6c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.464158 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"cc51941b-ed03-480a-a90b-ba40dec75a6c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.467410 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"48a5abfa-1c13-4130-8cad-1596c95ef581\") " pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.587082 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.603661 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.909582 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0be1b363-c0e5-4c73-9359-00032a6c8ab9" path="/var/lib/kubelet/pods/0be1b363-c0e5-4c73-9359-00032a6c8ab9/volumes" Oct 01 10:37:17 crc kubenswrapper[4735]: I1001 10:37:17.910730 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f74d6671-f7b0-46ae-91d4-ddb09a530249" path="/var/lib/kubelet/pods/f74d6671-f7b0-46ae-91d4-ddb09a530249/volumes" Oct 01 10:37:18 crc kubenswrapper[4735]: I1001 10:37:18.026647 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 10:37:18 crc kubenswrapper[4735]: I1001 10:37:18.086722 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 10:37:18 crc kubenswrapper[4735]: W1001 10:37:18.092846 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48a5abfa_1c13_4130_8cad_1596c95ef581.slice/crio-767efffb6bdd89d88a49dbecb49373e3b74804f41d240b6e672e4d2cf76bc980 WatchSource:0}: Error finding container 767efffb6bdd89d88a49dbecb49373e3b74804f41d240b6e672e4d2cf76bc980: Status 404 returned error can't find the container with id 767efffb6bdd89d88a49dbecb49373e3b74804f41d240b6e672e4d2cf76bc980 Oct 01 10:37:18 crc kubenswrapper[4735]: I1001 10:37:18.129161 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"48a5abfa-1c13-4130-8cad-1596c95ef581","Type":"ContainerStarted","Data":"767efffb6bdd89d88a49dbecb49373e3b74804f41d240b6e672e4d2cf76bc980"} Oct 01 10:37:18 crc kubenswrapper[4735]: I1001 10:37:18.130574 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"cc51941b-ed03-480a-a90b-ba40dec75a6c","Type":"ContainerStarted","Data":"8b3114728871bcb7c033614434c613838c7384143f5e6ba7e4969ab37ac540ab"} Oct 01 10:37:19 crc kubenswrapper[4735]: I1001 10:37:19.569505 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-mndnk"] Oct 01 10:37:19 crc kubenswrapper[4735]: I1001 10:37:19.571911 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-mndnk" Oct 01 10:37:19 crc kubenswrapper[4735]: I1001 10:37:19.576564 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 01 10:37:19 crc kubenswrapper[4735]: I1001 10:37:19.579341 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-mndnk"] Oct 01 10:37:19 crc kubenswrapper[4735]: I1001 10:37:19.757272 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-config\") pod \"dnsmasq-dns-67b789f86c-mndnk\" (UID: \"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\") " pod="openstack/dnsmasq-dns-67b789f86c-mndnk" Oct 01 10:37:19 crc kubenswrapper[4735]: I1001 10:37:19.757630 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-mndnk\" (UID: \"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\") " pod="openstack/dnsmasq-dns-67b789f86c-mndnk" Oct 01 10:37:19 crc kubenswrapper[4735]: I1001 10:37:19.757683 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk9vk\" (UniqueName: \"kubernetes.io/projected/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-kube-api-access-mk9vk\") pod \"dnsmasq-dns-67b789f86c-mndnk\" (UID: 
\"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\") " pod="openstack/dnsmasq-dns-67b789f86c-mndnk" Oct 01 10:37:19 crc kubenswrapper[4735]: I1001 10:37:19.757706 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-mndnk\" (UID: \"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\") " pod="openstack/dnsmasq-dns-67b789f86c-mndnk" Oct 01 10:37:19 crc kubenswrapper[4735]: I1001 10:37:19.757760 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-dns-svc\") pod \"dnsmasq-dns-67b789f86c-mndnk\" (UID: \"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\") " pod="openstack/dnsmasq-dns-67b789f86c-mndnk" Oct 01 10:37:19 crc kubenswrapper[4735]: I1001 10:37:19.758054 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-mndnk\" (UID: \"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\") " pod="openstack/dnsmasq-dns-67b789f86c-mndnk" Oct 01 10:37:19 crc kubenswrapper[4735]: I1001 10:37:19.758141 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-mndnk\" (UID: \"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\") " pod="openstack/dnsmasq-dns-67b789f86c-mndnk" Oct 01 10:37:19 crc kubenswrapper[4735]: I1001 10:37:19.860305 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-config\") pod 
\"dnsmasq-dns-67b789f86c-mndnk\" (UID: \"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\") " pod="openstack/dnsmasq-dns-67b789f86c-mndnk" Oct 01 10:37:19 crc kubenswrapper[4735]: I1001 10:37:19.860367 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-mndnk\" (UID: \"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\") " pod="openstack/dnsmasq-dns-67b789f86c-mndnk" Oct 01 10:37:19 crc kubenswrapper[4735]: I1001 10:37:19.860414 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk9vk\" (UniqueName: \"kubernetes.io/projected/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-kube-api-access-mk9vk\") pod \"dnsmasq-dns-67b789f86c-mndnk\" (UID: \"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\") " pod="openstack/dnsmasq-dns-67b789f86c-mndnk" Oct 01 10:37:19 crc kubenswrapper[4735]: I1001 10:37:19.860443 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-mndnk\" (UID: \"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\") " pod="openstack/dnsmasq-dns-67b789f86c-mndnk" Oct 01 10:37:19 crc kubenswrapper[4735]: I1001 10:37:19.860483 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-dns-svc\") pod \"dnsmasq-dns-67b789f86c-mndnk\" (UID: \"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\") " pod="openstack/dnsmasq-dns-67b789f86c-mndnk" Oct 01 10:37:19 crc kubenswrapper[4735]: I1001 10:37:19.860599 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-mndnk\" (UID: 
\"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\") " pod="openstack/dnsmasq-dns-67b789f86c-mndnk" Oct 01 10:37:19 crc kubenswrapper[4735]: I1001 10:37:19.860660 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-mndnk\" (UID: \"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\") " pod="openstack/dnsmasq-dns-67b789f86c-mndnk" Oct 01 10:37:19 crc kubenswrapper[4735]: I1001 10:37:19.861469 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-config\") pod \"dnsmasq-dns-67b789f86c-mndnk\" (UID: \"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\") " pod="openstack/dnsmasq-dns-67b789f86c-mndnk" Oct 01 10:37:19 crc kubenswrapper[4735]: I1001 10:37:19.861468 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-mndnk\" (UID: \"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\") " pod="openstack/dnsmasq-dns-67b789f86c-mndnk" Oct 01 10:37:19 crc kubenswrapper[4735]: I1001 10:37:19.861590 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-mndnk\" (UID: \"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\") " pod="openstack/dnsmasq-dns-67b789f86c-mndnk" Oct 01 10:37:19 crc kubenswrapper[4735]: I1001 10:37:19.861653 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-mndnk\" (UID: \"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\") " 
pod="openstack/dnsmasq-dns-67b789f86c-mndnk" Oct 01 10:37:19 crc kubenswrapper[4735]: I1001 10:37:19.861752 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-mndnk\" (UID: \"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\") " pod="openstack/dnsmasq-dns-67b789f86c-mndnk" Oct 01 10:37:19 crc kubenswrapper[4735]: I1001 10:37:19.861774 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-dns-svc\") pod \"dnsmasq-dns-67b789f86c-mndnk\" (UID: \"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\") " pod="openstack/dnsmasq-dns-67b789f86c-mndnk" Oct 01 10:37:19 crc kubenswrapper[4735]: I1001 10:37:19.878659 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk9vk\" (UniqueName: \"kubernetes.io/projected/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-kube-api-access-mk9vk\") pod \"dnsmasq-dns-67b789f86c-mndnk\" (UID: \"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\") " pod="openstack/dnsmasq-dns-67b789f86c-mndnk" Oct 01 10:37:19 crc kubenswrapper[4735]: I1001 10:37:19.892397 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-mndnk" Oct 01 10:37:20 crc kubenswrapper[4735]: I1001 10:37:20.152380 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-mndnk"] Oct 01 10:37:20 crc kubenswrapper[4735]: I1001 10:37:20.157461 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"48a5abfa-1c13-4130-8cad-1596c95ef581","Type":"ContainerStarted","Data":"7db9346ba6a9652e7ffa531f642d4ad53e3c1b761051627853a404b9b119fefa"} Oct 01 10:37:20 crc kubenswrapper[4735]: I1001 10:37:20.160408 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"cc51941b-ed03-480a-a90b-ba40dec75a6c","Type":"ContainerStarted","Data":"a5c9a87dec2ed1169738343af4f344bf29ef7605d892266faf80a810a7c7109f"} Oct 01 10:37:21 crc kubenswrapper[4735]: I1001 10:37:21.170136 4735 generic.go:334] "Generic (PLEG): container finished" podID="eca3e0e4-a671-4e4d-a818-44dfc701c9b6" containerID="e025595306d8d282d898af5f637a78813c40d98a87b243d96d0a9b59cc83732b" exitCode=0 Oct 01 10:37:21 crc kubenswrapper[4735]: I1001 10:37:21.170250 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-mndnk" event={"ID":"eca3e0e4-a671-4e4d-a818-44dfc701c9b6","Type":"ContainerDied","Data":"e025595306d8d282d898af5f637a78813c40d98a87b243d96d0a9b59cc83732b"} Oct 01 10:37:21 crc kubenswrapper[4735]: I1001 10:37:21.170475 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-mndnk" event={"ID":"eca3e0e4-a671-4e4d-a818-44dfc701c9b6","Type":"ContainerStarted","Data":"d751ce7c26cbd44a3f9d053bafe6d17799bcbae2256d870c9eadd243ca39b6f8"} Oct 01 10:37:22 crc kubenswrapper[4735]: I1001 10:37:22.180161 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-mndnk" 
event={"ID":"eca3e0e4-a671-4e4d-a818-44dfc701c9b6","Type":"ContainerStarted","Data":"b904906e821b1aa945d61f4d524e9731e346082610f3eeeb43be6b7b8d74f3d6"} Oct 01 10:37:22 crc kubenswrapper[4735]: I1001 10:37:22.180674 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67b789f86c-mndnk" Oct 01 10:37:22 crc kubenswrapper[4735]: I1001 10:37:22.197592 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67b789f86c-mndnk" podStartSLOduration=3.197570392 podStartE2EDuration="3.197570392s" podCreationTimestamp="2025-10-01 10:37:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:37:22.196949105 +0000 UTC m=+1200.889770367" watchObservedRunningTime="2025-10-01 10:37:22.197570392 +0000 UTC m=+1200.890391664" Oct 01 10:37:29 crc kubenswrapper[4735]: I1001 10:37:29.894808 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67b789f86c-mndnk" Oct 01 10:37:29 crc kubenswrapper[4735]: I1001 10:37:29.994589 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-x52mp"] Oct 01 10:37:29 crc kubenswrapper[4735]: I1001 10:37:29.994857 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59cf4bdb65-x52mp" podUID="73183ba7-fe81-43ee-b62d-e843f83406c3" containerName="dnsmasq-dns" containerID="cri-o://d292d2caf6866c62adfa1b621927f50848b1212cedc33c7917ac6db36be8e688" gracePeriod=10 Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.127621 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-4vfgs"] Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.129021 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb6ffcf87-4vfgs" Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.144291 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-4vfgs"] Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.267455 4735 generic.go:334] "Generic (PLEG): container finished" podID="73183ba7-fe81-43ee-b62d-e843f83406c3" containerID="d292d2caf6866c62adfa1b621927f50848b1212cedc33c7917ac6db36be8e688" exitCode=0 Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.267744 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-x52mp" event={"ID":"73183ba7-fe81-43ee-b62d-e843f83406c3","Type":"ContainerDied","Data":"d292d2caf6866c62adfa1b621927f50848b1212cedc33c7917ac6db36be8e688"} Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.272755 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9b994d24-224b-42cf-8516-044c561a5f4e-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-4vfgs\" (UID: \"9b994d24-224b-42cf-8516-044c561a5f4e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-4vfgs" Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.272804 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b994d24-224b-42cf-8516-044c561a5f4e-config\") pod \"dnsmasq-dns-cb6ffcf87-4vfgs\" (UID: \"9b994d24-224b-42cf-8516-044c561a5f4e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-4vfgs" Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.272852 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9b994d24-224b-42cf-8516-044c561a5f4e-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-4vfgs\" (UID: \"9b994d24-224b-42cf-8516-044c561a5f4e\") " 
pod="openstack/dnsmasq-dns-cb6ffcf87-4vfgs" Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.272881 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b994d24-224b-42cf-8516-044c561a5f4e-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-4vfgs\" (UID: \"9b994d24-224b-42cf-8516-044c561a5f4e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-4vfgs" Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.272923 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b994d24-224b-42cf-8516-044c561a5f4e-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-4vfgs\" (UID: \"9b994d24-224b-42cf-8516-044c561a5f4e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-4vfgs" Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.272940 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5pbg\" (UniqueName: \"kubernetes.io/projected/9b994d24-224b-42cf-8516-044c561a5f4e-kube-api-access-q5pbg\") pod \"dnsmasq-dns-cb6ffcf87-4vfgs\" (UID: \"9b994d24-224b-42cf-8516-044c561a5f4e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-4vfgs" Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.272976 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b994d24-224b-42cf-8516-044c561a5f4e-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-4vfgs\" (UID: \"9b994d24-224b-42cf-8516-044c561a5f4e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-4vfgs" Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.374236 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b994d24-224b-42cf-8516-044c561a5f4e-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-4vfgs\" (UID: \"9b994d24-224b-42cf-8516-044c561a5f4e\") " 
pod="openstack/dnsmasq-dns-cb6ffcf87-4vfgs" Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.374293 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b994d24-224b-42cf-8516-044c561a5f4e-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-4vfgs\" (UID: \"9b994d24-224b-42cf-8516-044c561a5f4e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-4vfgs" Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.374315 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5pbg\" (UniqueName: \"kubernetes.io/projected/9b994d24-224b-42cf-8516-044c561a5f4e-kube-api-access-q5pbg\") pod \"dnsmasq-dns-cb6ffcf87-4vfgs\" (UID: \"9b994d24-224b-42cf-8516-044c561a5f4e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-4vfgs" Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.374352 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b994d24-224b-42cf-8516-044c561a5f4e-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-4vfgs\" (UID: \"9b994d24-224b-42cf-8516-044c561a5f4e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-4vfgs" Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.374400 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9b994d24-224b-42cf-8516-044c561a5f4e-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-4vfgs\" (UID: \"9b994d24-224b-42cf-8516-044c561a5f4e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-4vfgs" Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.374431 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b994d24-224b-42cf-8516-044c561a5f4e-config\") pod \"dnsmasq-dns-cb6ffcf87-4vfgs\" (UID: \"9b994d24-224b-42cf-8516-044c561a5f4e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-4vfgs" Oct 01 
10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.374472 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9b994d24-224b-42cf-8516-044c561a5f4e-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-4vfgs\" (UID: \"9b994d24-224b-42cf-8516-044c561a5f4e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-4vfgs" Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.375212 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b994d24-224b-42cf-8516-044c561a5f4e-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-4vfgs\" (UID: \"9b994d24-224b-42cf-8516-044c561a5f4e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-4vfgs" Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.375568 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b994d24-224b-42cf-8516-044c561a5f4e-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-4vfgs\" (UID: \"9b994d24-224b-42cf-8516-044c561a5f4e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-4vfgs" Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.375583 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9b994d24-224b-42cf-8516-044c561a5f4e-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-4vfgs\" (UID: \"9b994d24-224b-42cf-8516-044c561a5f4e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-4vfgs" Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.375607 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b994d24-224b-42cf-8516-044c561a5f4e-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-4vfgs\" (UID: \"9b994d24-224b-42cf-8516-044c561a5f4e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-4vfgs" Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.375645 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9b994d24-224b-42cf-8516-044c561a5f4e-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-4vfgs\" (UID: \"9b994d24-224b-42cf-8516-044c561a5f4e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-4vfgs" Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.376637 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b994d24-224b-42cf-8516-044c561a5f4e-config\") pod \"dnsmasq-dns-cb6ffcf87-4vfgs\" (UID: \"9b994d24-224b-42cf-8516-044c561a5f4e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-4vfgs" Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.393634 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5pbg\" (UniqueName: \"kubernetes.io/projected/9b994d24-224b-42cf-8516-044c561a5f4e-kube-api-access-q5pbg\") pod \"dnsmasq-dns-cb6ffcf87-4vfgs\" (UID: \"9b994d24-224b-42cf-8516-044c561a5f4e\") " pod="openstack/dnsmasq-dns-cb6ffcf87-4vfgs" Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.454527 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-x52mp" Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.479833 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb6ffcf87-4vfgs" Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.576855 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73183ba7-fe81-43ee-b62d-e843f83406c3-dns-svc\") pod \"73183ba7-fe81-43ee-b62d-e843f83406c3\" (UID: \"73183ba7-fe81-43ee-b62d-e843f83406c3\") " Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.577331 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqj5f\" (UniqueName: \"kubernetes.io/projected/73183ba7-fe81-43ee-b62d-e843f83406c3-kube-api-access-lqj5f\") pod \"73183ba7-fe81-43ee-b62d-e843f83406c3\" (UID: \"73183ba7-fe81-43ee-b62d-e843f83406c3\") " Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.577363 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73183ba7-fe81-43ee-b62d-e843f83406c3-config\") pod \"73183ba7-fe81-43ee-b62d-e843f83406c3\" (UID: \"73183ba7-fe81-43ee-b62d-e843f83406c3\") " Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.577985 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73183ba7-fe81-43ee-b62d-e843f83406c3-ovsdbserver-sb\") pod \"73183ba7-fe81-43ee-b62d-e843f83406c3\" (UID: \"73183ba7-fe81-43ee-b62d-e843f83406c3\") " Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.578028 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73183ba7-fe81-43ee-b62d-e843f83406c3-ovsdbserver-nb\") pod \"73183ba7-fe81-43ee-b62d-e843f83406c3\" (UID: \"73183ba7-fe81-43ee-b62d-e843f83406c3\") " Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.578160 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/73183ba7-fe81-43ee-b62d-e843f83406c3-dns-swift-storage-0\") pod \"73183ba7-fe81-43ee-b62d-e843f83406c3\" (UID: \"73183ba7-fe81-43ee-b62d-e843f83406c3\") " Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.583006 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73183ba7-fe81-43ee-b62d-e843f83406c3-kube-api-access-lqj5f" (OuterVolumeSpecName: "kube-api-access-lqj5f") pod "73183ba7-fe81-43ee-b62d-e843f83406c3" (UID: "73183ba7-fe81-43ee-b62d-e843f83406c3"). InnerVolumeSpecName "kube-api-access-lqj5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.625033 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73183ba7-fe81-43ee-b62d-e843f83406c3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "73183ba7-fe81-43ee-b62d-e843f83406c3" (UID: "73183ba7-fe81-43ee-b62d-e843f83406c3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.632325 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73183ba7-fe81-43ee-b62d-e843f83406c3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "73183ba7-fe81-43ee-b62d-e843f83406c3" (UID: "73183ba7-fe81-43ee-b62d-e843f83406c3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.636822 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73183ba7-fe81-43ee-b62d-e843f83406c3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "73183ba7-fe81-43ee-b62d-e843f83406c3" (UID: "73183ba7-fe81-43ee-b62d-e843f83406c3"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.644051 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73183ba7-fe81-43ee-b62d-e843f83406c3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "73183ba7-fe81-43ee-b62d-e843f83406c3" (UID: "73183ba7-fe81-43ee-b62d-e843f83406c3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.644102 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73183ba7-fe81-43ee-b62d-e843f83406c3-config" (OuterVolumeSpecName: "config") pod "73183ba7-fe81-43ee-b62d-e843f83406c3" (UID: "73183ba7-fe81-43ee-b62d-e843f83406c3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.680822 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqj5f\" (UniqueName: \"kubernetes.io/projected/73183ba7-fe81-43ee-b62d-e843f83406c3-kube-api-access-lqj5f\") on node \"crc\" DevicePath \"\"" Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.680847 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73183ba7-fe81-43ee-b62d-e843f83406c3-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.680856 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73183ba7-fe81-43ee-b62d-e843f83406c3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.680864 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73183ba7-fe81-43ee-b62d-e843f83406c3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 
10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.680872 4735 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73183ba7-fe81-43ee-b62d-e843f83406c3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.680880 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73183ba7-fe81-43ee-b62d-e843f83406c3-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 10:37:30 crc kubenswrapper[4735]: I1001 10:37:30.921242 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-4vfgs"] Oct 01 10:37:30 crc kubenswrapper[4735]: W1001 10:37:30.926022 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b994d24_224b_42cf_8516_044c561a5f4e.slice/crio-720fe181766855b0a7285c8d288a34598017e5b21255ebab7a1a4219861dff52 WatchSource:0}: Error finding container 720fe181766855b0a7285c8d288a34598017e5b21255ebab7a1a4219861dff52: Status 404 returned error can't find the container with id 720fe181766855b0a7285c8d288a34598017e5b21255ebab7a1a4219861dff52 Oct 01 10:37:31 crc kubenswrapper[4735]: I1001 10:37:31.288947 4735 generic.go:334] "Generic (PLEG): container finished" podID="9b994d24-224b-42cf-8516-044c561a5f4e" containerID="df0d50b5c8944a385e6ddcd0c91f1ed6cc24a611c461068fd5170024636968bf" exitCode=0 Oct 01 10:37:31 crc kubenswrapper[4735]: I1001 10:37:31.289050 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-4vfgs" event={"ID":"9b994d24-224b-42cf-8516-044c561a5f4e","Type":"ContainerDied","Data":"df0d50b5c8944a385e6ddcd0c91f1ed6cc24a611c461068fd5170024636968bf"} Oct 01 10:37:31 crc kubenswrapper[4735]: I1001 10:37:31.289280 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-4vfgs" 
event={"ID":"9b994d24-224b-42cf-8516-044c561a5f4e","Type":"ContainerStarted","Data":"720fe181766855b0a7285c8d288a34598017e5b21255ebab7a1a4219861dff52"} Oct 01 10:37:31 crc kubenswrapper[4735]: I1001 10:37:31.296105 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-x52mp" event={"ID":"73183ba7-fe81-43ee-b62d-e843f83406c3","Type":"ContainerDied","Data":"55a81045279116e784ec8701c95fc5cf5da975607170253621762dbf3fd5af7e"} Oct 01 10:37:31 crc kubenswrapper[4735]: I1001 10:37:31.296150 4735 scope.go:117] "RemoveContainer" containerID="d292d2caf6866c62adfa1b621927f50848b1212cedc33c7917ac6db36be8e688" Oct 01 10:37:31 crc kubenswrapper[4735]: I1001 10:37:31.296278 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-x52mp" Oct 01 10:37:31 crc kubenswrapper[4735]: I1001 10:37:31.414816 4735 scope.go:117] "RemoveContainer" containerID="8dee2661d98385c1217474cf803c0be7616076fbbe56fd12f2e678f22ed21359" Oct 01 10:37:31 crc kubenswrapper[4735]: I1001 10:37:31.509638 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-x52mp"] Oct 01 10:37:31 crc kubenswrapper[4735]: I1001 10:37:31.516684 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-x52mp"] Oct 01 10:37:31 crc kubenswrapper[4735]: I1001 10:37:31.912049 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73183ba7-fe81-43ee-b62d-e843f83406c3" path="/var/lib/kubelet/pods/73183ba7-fe81-43ee-b62d-e843f83406c3/volumes" Oct 01 10:37:32 crc kubenswrapper[4735]: I1001 10:37:32.310003 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-4vfgs" event={"ID":"9b994d24-224b-42cf-8516-044c561a5f4e","Type":"ContainerStarted","Data":"20eb9d1ae7c3e9959161d1255b3ad30ddcd856e5c494079f450183f90467655f"} Oct 01 10:37:32 crc kubenswrapper[4735]: I1001 10:37:32.310196 4735 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cb6ffcf87-4vfgs" Oct 01 10:37:32 crc kubenswrapper[4735]: I1001 10:37:32.332170 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cb6ffcf87-4vfgs" podStartSLOduration=2.332149452 podStartE2EDuration="2.332149452s" podCreationTimestamp="2025-10-01 10:37:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:37:32.326566672 +0000 UTC m=+1211.019387934" watchObservedRunningTime="2025-10-01 10:37:32.332149452 +0000 UTC m=+1211.024970714" Oct 01 10:37:35 crc kubenswrapper[4735]: I1001 10:37:35.485979 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 10:37:35 crc kubenswrapper[4735]: I1001 10:37:35.486406 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 10:37:35 crc kubenswrapper[4735]: I1001 10:37:35.486455 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" Oct 01 10:37:35 crc kubenswrapper[4735]: I1001 10:37:35.487047 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6042cbc8542ef3e83bd7d2006832c7a2d565a12a6c7538fc83d415163087f591"} pod="openshift-machine-config-operator/machine-config-daemon-xgg24" containerMessage="Container machine-config-daemon 
failed liveness probe, will be restarted" Oct 01 10:37:35 crc kubenswrapper[4735]: I1001 10:37:35.487113 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" containerID="cri-o://6042cbc8542ef3e83bd7d2006832c7a2d565a12a6c7538fc83d415163087f591" gracePeriod=600 Oct 01 10:37:36 crc kubenswrapper[4735]: I1001 10:37:36.348744 4735 generic.go:334] "Generic (PLEG): container finished" podID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerID="6042cbc8542ef3e83bd7d2006832c7a2d565a12a6c7538fc83d415163087f591" exitCode=0 Oct 01 10:37:36 crc kubenswrapper[4735]: I1001 10:37:36.349298 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" event={"ID":"8c2fdbf0-2469-4ca0-8624-d63609123cd1","Type":"ContainerDied","Data":"6042cbc8542ef3e83bd7d2006832c7a2d565a12a6c7538fc83d415163087f591"} Oct 01 10:37:36 crc kubenswrapper[4735]: I1001 10:37:36.349325 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" event={"ID":"8c2fdbf0-2469-4ca0-8624-d63609123cd1","Type":"ContainerStarted","Data":"32fd47d6848f87a0586764661e897b70eae54a447c3515a2ff59f02148ba9b6a"} Oct 01 10:37:36 crc kubenswrapper[4735]: I1001 10:37:36.349341 4735 scope.go:117] "RemoveContainer" containerID="1d509f3e1d9829219adbe6f0a296874023b5cdfe25a87df90afeebbd5d68c288" Oct 01 10:37:40 crc kubenswrapper[4735]: I1001 10:37:40.482047 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cb6ffcf87-4vfgs" Oct 01 10:37:40 crc kubenswrapper[4735]: I1001 10:37:40.561021 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-mndnk"] Oct 01 10:37:40 crc kubenswrapper[4735]: I1001 10:37:40.561373 4735 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/dnsmasq-dns-67b789f86c-mndnk" podUID="eca3e0e4-a671-4e4d-a818-44dfc701c9b6" containerName="dnsmasq-dns" containerID="cri-o://b904906e821b1aa945d61f4d524e9731e346082610f3eeeb43be6b7b8d74f3d6" gracePeriod=10 Oct 01 10:37:41 crc kubenswrapper[4735]: I1001 10:37:41.026349 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-mndnk" Oct 01 10:37:41 crc kubenswrapper[4735]: I1001 10:37:41.178385 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-ovsdbserver-nb\") pod \"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\" (UID: \"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\") " Oct 01 10:37:41 crc kubenswrapper[4735]: I1001 10:37:41.178470 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-openstack-edpm-ipam\") pod \"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\" (UID: \"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\") " Oct 01 10:37:41 crc kubenswrapper[4735]: I1001 10:37:41.178580 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-config\") pod \"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\" (UID: \"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\") " Oct 01 10:37:41 crc kubenswrapper[4735]: I1001 10:37:41.178649 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-dns-svc\") pod \"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\" (UID: \"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\") " Oct 01 10:37:41 crc kubenswrapper[4735]: I1001 10:37:41.178694 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-mk9vk\" (UniqueName: \"kubernetes.io/projected/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-kube-api-access-mk9vk\") pod \"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\" (UID: \"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\") " Oct 01 10:37:41 crc kubenswrapper[4735]: I1001 10:37:41.178808 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-ovsdbserver-sb\") pod \"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\" (UID: \"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\") " Oct 01 10:37:41 crc kubenswrapper[4735]: I1001 10:37:41.178864 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-dns-swift-storage-0\") pod \"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\" (UID: \"eca3e0e4-a671-4e4d-a818-44dfc701c9b6\") " Oct 01 10:37:41 crc kubenswrapper[4735]: I1001 10:37:41.185013 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-kube-api-access-mk9vk" (OuterVolumeSpecName: "kube-api-access-mk9vk") pod "eca3e0e4-a671-4e4d-a818-44dfc701c9b6" (UID: "eca3e0e4-a671-4e4d-a818-44dfc701c9b6"). InnerVolumeSpecName "kube-api-access-mk9vk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:37:41 crc kubenswrapper[4735]: I1001 10:37:41.234123 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "eca3e0e4-a671-4e4d-a818-44dfc701c9b6" (UID: "eca3e0e4-a671-4e4d-a818-44dfc701c9b6"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:37:41 crc kubenswrapper[4735]: I1001 10:37:41.240811 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-config" (OuterVolumeSpecName: "config") pod "eca3e0e4-a671-4e4d-a818-44dfc701c9b6" (UID: "eca3e0e4-a671-4e4d-a818-44dfc701c9b6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:37:41 crc kubenswrapper[4735]: I1001 10:37:41.242519 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "eca3e0e4-a671-4e4d-a818-44dfc701c9b6" (UID: "eca3e0e4-a671-4e4d-a818-44dfc701c9b6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:37:41 crc kubenswrapper[4735]: I1001 10:37:41.243703 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "eca3e0e4-a671-4e4d-a818-44dfc701c9b6" (UID: "eca3e0e4-a671-4e4d-a818-44dfc701c9b6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:37:41 crc kubenswrapper[4735]: I1001 10:37:41.245595 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "eca3e0e4-a671-4e4d-a818-44dfc701c9b6" (UID: "eca3e0e4-a671-4e4d-a818-44dfc701c9b6"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:37:41 crc kubenswrapper[4735]: I1001 10:37:41.255031 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eca3e0e4-a671-4e4d-a818-44dfc701c9b6" (UID: "eca3e0e4-a671-4e4d-a818-44dfc701c9b6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:37:41 crc kubenswrapper[4735]: I1001 10:37:41.281982 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 10:37:41 crc kubenswrapper[4735]: I1001 10:37:41.282010 4735 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 01 10:37:41 crc kubenswrapper[4735]: I1001 10:37:41.282023 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-config\") on node \"crc\" DevicePath \"\"" Oct 01 10:37:41 crc kubenswrapper[4735]: I1001 10:37:41.282032 4735 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 10:37:41 crc kubenswrapper[4735]: I1001 10:37:41.282040 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mk9vk\" (UniqueName: \"kubernetes.io/projected/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-kube-api-access-mk9vk\") on node \"crc\" DevicePath \"\"" Oct 01 10:37:41 crc kubenswrapper[4735]: I1001 10:37:41.282050 4735 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 10:37:41 crc kubenswrapper[4735]: I1001 10:37:41.282058 4735 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eca3e0e4-a671-4e4d-a818-44dfc701c9b6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 10:37:41 crc kubenswrapper[4735]: I1001 10:37:41.399904 4735 generic.go:334] "Generic (PLEG): container finished" podID="eca3e0e4-a671-4e4d-a818-44dfc701c9b6" containerID="b904906e821b1aa945d61f4d524e9731e346082610f3eeeb43be6b7b8d74f3d6" exitCode=0 Oct 01 10:37:41 crc kubenswrapper[4735]: I1001 10:37:41.400026 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-mndnk" Oct 01 10:37:41 crc kubenswrapper[4735]: I1001 10:37:41.400045 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-mndnk" event={"ID":"eca3e0e4-a671-4e4d-a818-44dfc701c9b6","Type":"ContainerDied","Data":"b904906e821b1aa945d61f4d524e9731e346082610f3eeeb43be6b7b8d74f3d6"} Oct 01 10:37:41 crc kubenswrapper[4735]: I1001 10:37:41.400316 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-mndnk" event={"ID":"eca3e0e4-a671-4e4d-a818-44dfc701c9b6","Type":"ContainerDied","Data":"d751ce7c26cbd44a3f9d053bafe6d17799bcbae2256d870c9eadd243ca39b6f8"} Oct 01 10:37:41 crc kubenswrapper[4735]: I1001 10:37:41.400337 4735 scope.go:117] "RemoveContainer" containerID="b904906e821b1aa945d61f4d524e9731e346082610f3eeeb43be6b7b8d74f3d6" Oct 01 10:37:41 crc kubenswrapper[4735]: I1001 10:37:41.423673 4735 scope.go:117] "RemoveContainer" containerID="e025595306d8d282d898af5f637a78813c40d98a87b243d96d0a9b59cc83732b" Oct 01 10:37:41 crc kubenswrapper[4735]: I1001 10:37:41.431094 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-mndnk"] Oct 01 10:37:41 
crc kubenswrapper[4735]: I1001 10:37:41.438034 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-mndnk"] Oct 01 10:37:41 crc kubenswrapper[4735]: I1001 10:37:41.442286 4735 scope.go:117] "RemoveContainer" containerID="b904906e821b1aa945d61f4d524e9731e346082610f3eeeb43be6b7b8d74f3d6" Oct 01 10:37:41 crc kubenswrapper[4735]: E1001 10:37:41.442749 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b904906e821b1aa945d61f4d524e9731e346082610f3eeeb43be6b7b8d74f3d6\": container with ID starting with b904906e821b1aa945d61f4d524e9731e346082610f3eeeb43be6b7b8d74f3d6 not found: ID does not exist" containerID="b904906e821b1aa945d61f4d524e9731e346082610f3eeeb43be6b7b8d74f3d6" Oct 01 10:37:41 crc kubenswrapper[4735]: I1001 10:37:41.442781 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b904906e821b1aa945d61f4d524e9731e346082610f3eeeb43be6b7b8d74f3d6"} err="failed to get container status \"b904906e821b1aa945d61f4d524e9731e346082610f3eeeb43be6b7b8d74f3d6\": rpc error: code = NotFound desc = could not find container \"b904906e821b1aa945d61f4d524e9731e346082610f3eeeb43be6b7b8d74f3d6\": container with ID starting with b904906e821b1aa945d61f4d524e9731e346082610f3eeeb43be6b7b8d74f3d6 not found: ID does not exist" Oct 01 10:37:41 crc kubenswrapper[4735]: I1001 10:37:41.442807 4735 scope.go:117] "RemoveContainer" containerID="e025595306d8d282d898af5f637a78813c40d98a87b243d96d0a9b59cc83732b" Oct 01 10:37:41 crc kubenswrapper[4735]: E1001 10:37:41.443022 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e025595306d8d282d898af5f637a78813c40d98a87b243d96d0a9b59cc83732b\": container with ID starting with e025595306d8d282d898af5f637a78813c40d98a87b243d96d0a9b59cc83732b not found: ID does not exist" 
containerID="e025595306d8d282d898af5f637a78813c40d98a87b243d96d0a9b59cc83732b" Oct 01 10:37:41 crc kubenswrapper[4735]: I1001 10:37:41.443043 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e025595306d8d282d898af5f637a78813c40d98a87b243d96d0a9b59cc83732b"} err="failed to get container status \"e025595306d8d282d898af5f637a78813c40d98a87b243d96d0a9b59cc83732b\": rpc error: code = NotFound desc = could not find container \"e025595306d8d282d898af5f637a78813c40d98a87b243d96d0a9b59cc83732b\": container with ID starting with e025595306d8d282d898af5f637a78813c40d98a87b243d96d0a9b59cc83732b not found: ID does not exist" Oct 01 10:37:41 crc kubenswrapper[4735]: I1001 10:37:41.913729 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eca3e0e4-a671-4e4d-a818-44dfc701c9b6" path="/var/lib/kubelet/pods/eca3e0e4-a671-4e4d-a818-44dfc701c9b6/volumes" Oct 01 10:37:52 crc kubenswrapper[4735]: I1001 10:37:52.514471 4735 generic.go:334] "Generic (PLEG): container finished" podID="cc51941b-ed03-480a-a90b-ba40dec75a6c" containerID="a5c9a87dec2ed1169738343af4f344bf29ef7605d892266faf80a810a7c7109f" exitCode=0 Oct 01 10:37:52 crc kubenswrapper[4735]: I1001 10:37:52.514559 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"cc51941b-ed03-480a-a90b-ba40dec75a6c","Type":"ContainerDied","Data":"a5c9a87dec2ed1169738343af4f344bf29ef7605d892266faf80a810a7c7109f"} Oct 01 10:37:52 crc kubenswrapper[4735]: I1001 10:37:52.518362 4735 generic.go:334] "Generic (PLEG): container finished" podID="48a5abfa-1c13-4130-8cad-1596c95ef581" containerID="7db9346ba6a9652e7ffa531f642d4ad53e3c1b761051627853a404b9b119fefa" exitCode=0 Oct 01 10:37:52 crc kubenswrapper[4735]: I1001 10:37:52.518392 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"48a5abfa-1c13-4130-8cad-1596c95ef581","Type":"ContainerDied","Data":"7db9346ba6a9652e7ffa531f642d4ad53e3c1b761051627853a404b9b119fefa"} Oct 01 10:37:53 crc kubenswrapper[4735]: I1001 10:37:53.532136 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"cc51941b-ed03-480a-a90b-ba40dec75a6c","Type":"ContainerStarted","Data":"b985b68f644defe75a04654db6518d442a7a715967016b83b653858e8c7eb75e"} Oct 01 10:37:53 crc kubenswrapper[4735]: I1001 10:37:53.532995 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:37:53 crc kubenswrapper[4735]: I1001 10:37:53.535244 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"48a5abfa-1c13-4130-8cad-1596c95ef581","Type":"ContainerStarted","Data":"6f059b08cd4b5ae795f1011f36523028f8cd7fb32f71887af0ed8d30148349b6"} Oct 01 10:37:53 crc kubenswrapper[4735]: I1001 10:37:53.535780 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 01 10:37:53 crc kubenswrapper[4735]: I1001 10:37:53.551890 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs"] Oct 01 10:37:53 crc kubenswrapper[4735]: E1001 10:37:53.552453 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eca3e0e4-a671-4e4d-a818-44dfc701c9b6" containerName="init" Oct 01 10:37:53 crc kubenswrapper[4735]: I1001 10:37:53.552473 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="eca3e0e4-a671-4e4d-a818-44dfc701c9b6" containerName="init" Oct 01 10:37:53 crc kubenswrapper[4735]: E1001 10:37:53.552492 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eca3e0e4-a671-4e4d-a818-44dfc701c9b6" containerName="dnsmasq-dns" Oct 01 10:37:53 crc kubenswrapper[4735]: I1001 10:37:53.552517 4735 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="eca3e0e4-a671-4e4d-a818-44dfc701c9b6" containerName="dnsmasq-dns" Oct 01 10:37:53 crc kubenswrapper[4735]: E1001 10:37:53.552552 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73183ba7-fe81-43ee-b62d-e843f83406c3" containerName="dnsmasq-dns" Oct 01 10:37:53 crc kubenswrapper[4735]: I1001 10:37:53.552561 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="73183ba7-fe81-43ee-b62d-e843f83406c3" containerName="dnsmasq-dns" Oct 01 10:37:53 crc kubenswrapper[4735]: E1001 10:37:53.552573 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73183ba7-fe81-43ee-b62d-e843f83406c3" containerName="init" Oct 01 10:37:53 crc kubenswrapper[4735]: I1001 10:37:53.552592 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="73183ba7-fe81-43ee-b62d-e843f83406c3" containerName="init" Oct 01 10:37:53 crc kubenswrapper[4735]: I1001 10:37:53.552856 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="73183ba7-fe81-43ee-b62d-e843f83406c3" containerName="dnsmasq-dns" Oct 01 10:37:53 crc kubenswrapper[4735]: I1001 10:37:53.552872 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="eca3e0e4-a671-4e4d-a818-44dfc701c9b6" containerName="dnsmasq-dns" Oct 01 10:37:53 crc kubenswrapper[4735]: I1001 10:37:53.553647 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs" Oct 01 10:37:53 crc kubenswrapper[4735]: I1001 10:37:53.556384 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 10:37:53 crc kubenswrapper[4735]: I1001 10:37:53.557583 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 10:37:53 crc kubenswrapper[4735]: I1001 10:37:53.557845 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ggpjk" Oct 01 10:37:53 crc kubenswrapper[4735]: I1001 10:37:53.558919 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 10:37:53 crc kubenswrapper[4735]: I1001 10:37:53.564274 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs"] Oct 01 10:37:53 crc kubenswrapper[4735]: I1001 10:37:53.570294 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.570274756 podStartE2EDuration="36.570274756s" podCreationTimestamp="2025-10-01 10:37:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 10:37:53.564612234 +0000 UTC m=+1232.257433546" watchObservedRunningTime="2025-10-01 10:37:53.570274756 +0000 UTC m=+1232.263096028" Oct 01 10:37:53 crc kubenswrapper[4735]: I1001 10:37:53.616636 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.616612771 podStartE2EDuration="36.616612771s" podCreationTimestamp="2025-10-01 10:37:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 
10:37:53.608050101 +0000 UTC m=+1232.300871373" watchObservedRunningTime="2025-10-01 10:37:53.616612771 +0000 UTC m=+1232.309434043" Oct 01 10:37:53 crc kubenswrapper[4735]: I1001 10:37:53.619644 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7847fa7b-0680-48a0-bbba-2adf6b14fcec-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs\" (UID: \"7847fa7b-0680-48a0-bbba-2adf6b14fcec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs" Oct 01 10:37:53 crc kubenswrapper[4735]: I1001 10:37:53.619731 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7847fa7b-0680-48a0-bbba-2adf6b14fcec-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs\" (UID: \"7847fa7b-0680-48a0-bbba-2adf6b14fcec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs" Oct 01 10:37:53 crc kubenswrapper[4735]: I1001 10:37:53.619864 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d68lb\" (UniqueName: \"kubernetes.io/projected/7847fa7b-0680-48a0-bbba-2adf6b14fcec-kube-api-access-d68lb\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs\" (UID: \"7847fa7b-0680-48a0-bbba-2adf6b14fcec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs" Oct 01 10:37:53 crc kubenswrapper[4735]: I1001 10:37:53.621216 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7847fa7b-0680-48a0-bbba-2adf6b14fcec-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs\" (UID: \"7847fa7b-0680-48a0-bbba-2adf6b14fcec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs" Oct 01 10:37:53 crc 
kubenswrapper[4735]: I1001 10:37:53.723293 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7847fa7b-0680-48a0-bbba-2adf6b14fcec-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs\" (UID: \"7847fa7b-0680-48a0-bbba-2adf6b14fcec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs" Oct 01 10:37:53 crc kubenswrapper[4735]: I1001 10:37:53.723351 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7847fa7b-0680-48a0-bbba-2adf6b14fcec-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs\" (UID: \"7847fa7b-0680-48a0-bbba-2adf6b14fcec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs" Oct 01 10:37:53 crc kubenswrapper[4735]: I1001 10:37:53.723384 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7847fa7b-0680-48a0-bbba-2adf6b14fcec-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs\" (UID: \"7847fa7b-0680-48a0-bbba-2adf6b14fcec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs" Oct 01 10:37:53 crc kubenswrapper[4735]: I1001 10:37:53.723459 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d68lb\" (UniqueName: \"kubernetes.io/projected/7847fa7b-0680-48a0-bbba-2adf6b14fcec-kube-api-access-d68lb\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs\" (UID: \"7847fa7b-0680-48a0-bbba-2adf6b14fcec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs" Oct 01 10:37:53 crc kubenswrapper[4735]: I1001 10:37:53.730332 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7847fa7b-0680-48a0-bbba-2adf6b14fcec-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs\" (UID: \"7847fa7b-0680-48a0-bbba-2adf6b14fcec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs" Oct 01 10:37:53 crc kubenswrapper[4735]: I1001 10:37:53.731071 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7847fa7b-0680-48a0-bbba-2adf6b14fcec-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs\" (UID: \"7847fa7b-0680-48a0-bbba-2adf6b14fcec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs" Oct 01 10:37:53 crc kubenswrapper[4735]: I1001 10:37:53.733373 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7847fa7b-0680-48a0-bbba-2adf6b14fcec-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs\" (UID: \"7847fa7b-0680-48a0-bbba-2adf6b14fcec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs" Oct 01 10:37:53 crc kubenswrapper[4735]: I1001 10:37:53.755739 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d68lb\" (UniqueName: \"kubernetes.io/projected/7847fa7b-0680-48a0-bbba-2adf6b14fcec-kube-api-access-d68lb\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs\" (UID: \"7847fa7b-0680-48a0-bbba-2adf6b14fcec\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs" Oct 01 10:37:53 crc kubenswrapper[4735]: I1001 10:37:53.892714 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs" Oct 01 10:37:54 crc kubenswrapper[4735]: I1001 10:37:54.486340 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs"] Oct 01 10:37:54 crc kubenswrapper[4735]: W1001 10:37:54.487682 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7847fa7b_0680_48a0_bbba_2adf6b14fcec.slice/crio-716fe0a389c082d3531219495603aa1feaf1656fa07278011484fe25844e7f6b WatchSource:0}: Error finding container 716fe0a389c082d3531219495603aa1feaf1656fa07278011484fe25844e7f6b: Status 404 returned error can't find the container with id 716fe0a389c082d3531219495603aa1feaf1656fa07278011484fe25844e7f6b Oct 01 10:37:54 crc kubenswrapper[4735]: I1001 10:37:54.489690 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 10:37:54 crc kubenswrapper[4735]: I1001 10:37:54.546954 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs" event={"ID":"7847fa7b-0680-48a0-bbba-2adf6b14fcec","Type":"ContainerStarted","Data":"716fe0a389c082d3531219495603aa1feaf1656fa07278011484fe25844e7f6b"} Oct 01 10:38:04 crc kubenswrapper[4735]: I1001 10:38:04.656644 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs" event={"ID":"7847fa7b-0680-48a0-bbba-2adf6b14fcec","Type":"ContainerStarted","Data":"f21f69580abe917de8d6b69629da5ad4aab71183b22062b36191781f7bc617ba"} Oct 01 10:38:04 crc kubenswrapper[4735]: I1001 10:38:04.690177 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs" podStartSLOduration=2.079028868 podStartE2EDuration="11.690151916s" podCreationTimestamp="2025-10-01 10:37:53 +0000 UTC" 
firstStartedPulling="2025-10-01 10:37:54.489424368 +0000 UTC m=+1233.182245630" lastFinishedPulling="2025-10-01 10:38:04.100547406 +0000 UTC m=+1242.793368678" observedRunningTime="2025-10-01 10:38:04.675329528 +0000 UTC m=+1243.368150830" watchObservedRunningTime="2025-10-01 10:38:04.690151916 +0000 UTC m=+1243.382973218" Oct 01 10:38:07 crc kubenswrapper[4735]: I1001 10:38:07.593776 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 01 10:38:07 crc kubenswrapper[4735]: I1001 10:38:07.608744 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 01 10:38:16 crc kubenswrapper[4735]: I1001 10:38:16.803902 4735 generic.go:334] "Generic (PLEG): container finished" podID="7847fa7b-0680-48a0-bbba-2adf6b14fcec" containerID="f21f69580abe917de8d6b69629da5ad4aab71183b22062b36191781f7bc617ba" exitCode=0 Oct 01 10:38:16 crc kubenswrapper[4735]: I1001 10:38:16.804059 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs" event={"ID":"7847fa7b-0680-48a0-bbba-2adf6b14fcec","Type":"ContainerDied","Data":"f21f69580abe917de8d6b69629da5ad4aab71183b22062b36191781f7bc617ba"} Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:18.267728 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs" Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:18.410399 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7847fa7b-0680-48a0-bbba-2adf6b14fcec-ssh-key\") pod \"7847fa7b-0680-48a0-bbba-2adf6b14fcec\" (UID: \"7847fa7b-0680-48a0-bbba-2adf6b14fcec\") " Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:18.410435 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7847fa7b-0680-48a0-bbba-2adf6b14fcec-inventory\") pod \"7847fa7b-0680-48a0-bbba-2adf6b14fcec\" (UID: \"7847fa7b-0680-48a0-bbba-2adf6b14fcec\") " Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:18.410567 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7847fa7b-0680-48a0-bbba-2adf6b14fcec-repo-setup-combined-ca-bundle\") pod \"7847fa7b-0680-48a0-bbba-2adf6b14fcec\" (UID: \"7847fa7b-0680-48a0-bbba-2adf6b14fcec\") " Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:18.410676 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d68lb\" (UniqueName: \"kubernetes.io/projected/7847fa7b-0680-48a0-bbba-2adf6b14fcec-kube-api-access-d68lb\") pod \"7847fa7b-0680-48a0-bbba-2adf6b14fcec\" (UID: \"7847fa7b-0680-48a0-bbba-2adf6b14fcec\") " Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:18.417318 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7847fa7b-0680-48a0-bbba-2adf6b14fcec-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "7847fa7b-0680-48a0-bbba-2adf6b14fcec" (UID: "7847fa7b-0680-48a0-bbba-2adf6b14fcec"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:18.417431 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7847fa7b-0680-48a0-bbba-2adf6b14fcec-kube-api-access-d68lb" (OuterVolumeSpecName: "kube-api-access-d68lb") pod "7847fa7b-0680-48a0-bbba-2adf6b14fcec" (UID: "7847fa7b-0680-48a0-bbba-2adf6b14fcec"). InnerVolumeSpecName "kube-api-access-d68lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:18.440585 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7847fa7b-0680-48a0-bbba-2adf6b14fcec-inventory" (OuterVolumeSpecName: "inventory") pod "7847fa7b-0680-48a0-bbba-2adf6b14fcec" (UID: "7847fa7b-0680-48a0-bbba-2adf6b14fcec"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:18.443201 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7847fa7b-0680-48a0-bbba-2adf6b14fcec-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7847fa7b-0680-48a0-bbba-2adf6b14fcec" (UID: "7847fa7b-0680-48a0-bbba-2adf6b14fcec"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:18.513553 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7847fa7b-0680-48a0-bbba-2adf6b14fcec-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:18.513585 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7847fa7b-0680-48a0-bbba-2adf6b14fcec-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:18.513599 4735 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7847fa7b-0680-48a0-bbba-2adf6b14fcec-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:18.513615 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d68lb\" (UniqueName: \"kubernetes.io/projected/7847fa7b-0680-48a0-bbba-2adf6b14fcec-kube-api-access-d68lb\") on node \"crc\" DevicePath \"\"" Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:18.835250 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs" event={"ID":"7847fa7b-0680-48a0-bbba-2adf6b14fcec","Type":"ContainerDied","Data":"716fe0a389c082d3531219495603aa1feaf1656fa07278011484fe25844e7f6b"} Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:18.835279 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs" Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:18.835295 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="716fe0a389c082d3531219495603aa1feaf1656fa07278011484fe25844e7f6b" Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:18.937820 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-ldfrh"] Oct 01 10:38:19 crc kubenswrapper[4735]: E1001 10:38:18.938459 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7847fa7b-0680-48a0-bbba-2adf6b14fcec" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:18.938482 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7847fa7b-0680-48a0-bbba-2adf6b14fcec" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:18.938876 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7847fa7b-0680-48a0-bbba-2adf6b14fcec" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:18.939741 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ldfrh" Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:18.942746 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:18.942782 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:18.942788 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:18.944774 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ggpjk" Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:18.950762 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-ldfrh"] Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:19.127609 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3b0231f-c3ff-46dd-869c-c36d62466f45-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ldfrh\" (UID: \"f3b0231f-c3ff-46dd-869c-c36d62466f45\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ldfrh" Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:19.127699 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3b0231f-c3ff-46dd-869c-c36d62466f45-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ldfrh\" (UID: \"f3b0231f-c3ff-46dd-869c-c36d62466f45\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ldfrh" Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:19.128048 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmgvl\" (UniqueName: \"kubernetes.io/projected/f3b0231f-c3ff-46dd-869c-c36d62466f45-kube-api-access-zmgvl\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ldfrh\" (UID: \"f3b0231f-c3ff-46dd-869c-c36d62466f45\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ldfrh" Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:19.231184 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3b0231f-c3ff-46dd-869c-c36d62466f45-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ldfrh\" (UID: \"f3b0231f-c3ff-46dd-869c-c36d62466f45\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ldfrh" Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:19.231241 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3b0231f-c3ff-46dd-869c-c36d62466f45-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ldfrh\" (UID: \"f3b0231f-c3ff-46dd-869c-c36d62466f45\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ldfrh" Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:19.231306 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmgvl\" (UniqueName: \"kubernetes.io/projected/f3b0231f-c3ff-46dd-869c-c36d62466f45-kube-api-access-zmgvl\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ldfrh\" (UID: \"f3b0231f-c3ff-46dd-869c-c36d62466f45\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ldfrh" Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:19.236973 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3b0231f-c3ff-46dd-869c-c36d62466f45-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ldfrh\" (UID: \"f3b0231f-c3ff-46dd-869c-c36d62466f45\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ldfrh" Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:19.237813 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3b0231f-c3ff-46dd-869c-c36d62466f45-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ldfrh\" (UID: \"f3b0231f-c3ff-46dd-869c-c36d62466f45\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ldfrh" Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:19.262004 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmgvl\" (UniqueName: \"kubernetes.io/projected/f3b0231f-c3ff-46dd-869c-c36d62466f45-kube-api-access-zmgvl\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ldfrh\" (UID: \"f3b0231f-c3ff-46dd-869c-c36d62466f45\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ldfrh" Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:19.266644 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ldfrh" Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:19.828269 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-ldfrh"] Oct 01 10:38:19 crc kubenswrapper[4735]: W1001 10:38:19.833214 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3b0231f_c3ff_46dd_869c_c36d62466f45.slice/crio-0ea5f90f3591f96e87616caef133f5fe35988afa5047d2788ddc568cec2e6d6f WatchSource:0}: Error finding container 0ea5f90f3591f96e87616caef133f5fe35988afa5047d2788ddc568cec2e6d6f: Status 404 returned error can't find the container with id 0ea5f90f3591f96e87616caef133f5fe35988afa5047d2788ddc568cec2e6d6f Oct 01 10:38:19 crc kubenswrapper[4735]: I1001 10:38:19.845846 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ldfrh" event={"ID":"f3b0231f-c3ff-46dd-869c-c36d62466f45","Type":"ContainerStarted","Data":"0ea5f90f3591f96e87616caef133f5fe35988afa5047d2788ddc568cec2e6d6f"} Oct 01 10:38:20 crc kubenswrapper[4735]: I1001 10:38:20.855531 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ldfrh" event={"ID":"f3b0231f-c3ff-46dd-869c-c36d62466f45","Type":"ContainerStarted","Data":"c79c6803bb617d862bee8a693227626af176e28e7071b0175d3d425ec2cd11de"} Oct 01 10:38:20 crc kubenswrapper[4735]: I1001 10:38:20.870804 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ldfrh" podStartSLOduration=2.304859449 podStartE2EDuration="2.870781623s" podCreationTimestamp="2025-10-01 10:38:18 +0000 UTC" firstStartedPulling="2025-10-01 10:38:19.836919659 +0000 UTC m=+1258.529740931" lastFinishedPulling="2025-10-01 10:38:20.402841843 +0000 UTC m=+1259.095663105" observedRunningTime="2025-10-01 
10:38:20.868551073 +0000 UTC m=+1259.561372345" watchObservedRunningTime="2025-10-01 10:38:20.870781623 +0000 UTC m=+1259.563602905" Oct 01 10:38:23 crc kubenswrapper[4735]: I1001 10:38:23.892385 4735 generic.go:334] "Generic (PLEG): container finished" podID="f3b0231f-c3ff-46dd-869c-c36d62466f45" containerID="c79c6803bb617d862bee8a693227626af176e28e7071b0175d3d425ec2cd11de" exitCode=0 Oct 01 10:38:23 crc kubenswrapper[4735]: I1001 10:38:23.892549 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ldfrh" event={"ID":"f3b0231f-c3ff-46dd-869c-c36d62466f45","Type":"ContainerDied","Data":"c79c6803bb617d862bee8a693227626af176e28e7071b0175d3d425ec2cd11de"} Oct 01 10:38:25 crc kubenswrapper[4735]: I1001 10:38:25.335714 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ldfrh" Oct 01 10:38:25 crc kubenswrapper[4735]: I1001 10:38:25.466542 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3b0231f-c3ff-46dd-869c-c36d62466f45-inventory\") pod \"f3b0231f-c3ff-46dd-869c-c36d62466f45\" (UID: \"f3b0231f-c3ff-46dd-869c-c36d62466f45\") " Oct 01 10:38:25 crc kubenswrapper[4735]: I1001 10:38:25.466737 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmgvl\" (UniqueName: \"kubernetes.io/projected/f3b0231f-c3ff-46dd-869c-c36d62466f45-kube-api-access-zmgvl\") pod \"f3b0231f-c3ff-46dd-869c-c36d62466f45\" (UID: \"f3b0231f-c3ff-46dd-869c-c36d62466f45\") " Oct 01 10:38:25 crc kubenswrapper[4735]: I1001 10:38:25.466891 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3b0231f-c3ff-46dd-869c-c36d62466f45-ssh-key\") pod \"f3b0231f-c3ff-46dd-869c-c36d62466f45\" (UID: \"f3b0231f-c3ff-46dd-869c-c36d62466f45\") " Oct 01 10:38:25 crc 
kubenswrapper[4735]: I1001 10:38:25.473304 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3b0231f-c3ff-46dd-869c-c36d62466f45-kube-api-access-zmgvl" (OuterVolumeSpecName: "kube-api-access-zmgvl") pod "f3b0231f-c3ff-46dd-869c-c36d62466f45" (UID: "f3b0231f-c3ff-46dd-869c-c36d62466f45"). InnerVolumeSpecName "kube-api-access-zmgvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:38:25 crc kubenswrapper[4735]: I1001 10:38:25.498970 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3b0231f-c3ff-46dd-869c-c36d62466f45-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f3b0231f-c3ff-46dd-869c-c36d62466f45" (UID: "f3b0231f-c3ff-46dd-869c-c36d62466f45"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:38:25 crc kubenswrapper[4735]: I1001 10:38:25.510693 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3b0231f-c3ff-46dd-869c-c36d62466f45-inventory" (OuterVolumeSpecName: "inventory") pod "f3b0231f-c3ff-46dd-869c-c36d62466f45" (UID: "f3b0231f-c3ff-46dd-869c-c36d62466f45"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:38:25 crc kubenswrapper[4735]: I1001 10:38:25.569087 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3b0231f-c3ff-46dd-869c-c36d62466f45-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 10:38:25 crc kubenswrapper[4735]: I1001 10:38:25.569122 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmgvl\" (UniqueName: \"kubernetes.io/projected/f3b0231f-c3ff-46dd-869c-c36d62466f45-kube-api-access-zmgvl\") on node \"crc\" DevicePath \"\"" Oct 01 10:38:25 crc kubenswrapper[4735]: I1001 10:38:25.569134 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3b0231f-c3ff-46dd-869c-c36d62466f45-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 10:38:25 crc kubenswrapper[4735]: I1001 10:38:25.917653 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ldfrh" event={"ID":"f3b0231f-c3ff-46dd-869c-c36d62466f45","Type":"ContainerDied","Data":"0ea5f90f3591f96e87616caef133f5fe35988afa5047d2788ddc568cec2e6d6f"} Oct 01 10:38:25 crc kubenswrapper[4735]: I1001 10:38:25.917930 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ea5f90f3591f96e87616caef133f5fe35988afa5047d2788ddc568cec2e6d6f" Oct 01 10:38:25 crc kubenswrapper[4735]: I1001 10:38:25.918003 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ldfrh" Oct 01 10:38:25 crc kubenswrapper[4735]: I1001 10:38:25.980171 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq"] Oct 01 10:38:25 crc kubenswrapper[4735]: E1001 10:38:25.980546 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3b0231f-c3ff-46dd-869c-c36d62466f45" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 01 10:38:25 crc kubenswrapper[4735]: I1001 10:38:25.980562 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3b0231f-c3ff-46dd-869c-c36d62466f45" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 01 10:38:25 crc kubenswrapper[4735]: I1001 10:38:25.980724 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3b0231f-c3ff-46dd-869c-c36d62466f45" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 01 10:38:25 crc kubenswrapper[4735]: I1001 10:38:25.981275 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq" Oct 01 10:38:25 crc kubenswrapper[4735]: I1001 10:38:25.983743 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 10:38:25 crc kubenswrapper[4735]: I1001 10:38:25.983989 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 10:38:25 crc kubenswrapper[4735]: I1001 10:38:25.984270 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ggpjk" Oct 01 10:38:25 crc kubenswrapper[4735]: I1001 10:38:25.984401 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 10:38:25 crc kubenswrapper[4735]: I1001 10:38:25.999931 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq"] Oct 01 10:38:26 crc kubenswrapper[4735]: I1001 10:38:26.079023 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7czz\" (UniqueName: \"kubernetes.io/projected/ffacc50e-734b-4e8a-ac0c-a33197ce2351-kube-api-access-n7czz\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq\" (UID: \"ffacc50e-734b-4e8a-ac0c-a33197ce2351\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq" Oct 01 10:38:26 crc kubenswrapper[4735]: I1001 10:38:26.079167 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffacc50e-734b-4e8a-ac0c-a33197ce2351-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq\" (UID: \"ffacc50e-734b-4e8a-ac0c-a33197ce2351\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq" Oct 01 10:38:26 crc kubenswrapper[4735]: I1001 
10:38:26.079224 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffacc50e-734b-4e8a-ac0c-a33197ce2351-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq\" (UID: \"ffacc50e-734b-4e8a-ac0c-a33197ce2351\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq" Oct 01 10:38:26 crc kubenswrapper[4735]: I1001 10:38:26.079276 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffacc50e-734b-4e8a-ac0c-a33197ce2351-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq\" (UID: \"ffacc50e-734b-4e8a-ac0c-a33197ce2351\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq" Oct 01 10:38:26 crc kubenswrapper[4735]: I1001 10:38:26.181538 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7czz\" (UniqueName: \"kubernetes.io/projected/ffacc50e-734b-4e8a-ac0c-a33197ce2351-kube-api-access-n7czz\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq\" (UID: \"ffacc50e-734b-4e8a-ac0c-a33197ce2351\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq" Oct 01 10:38:26 crc kubenswrapper[4735]: I1001 10:38:26.182171 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffacc50e-734b-4e8a-ac0c-a33197ce2351-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq\" (UID: \"ffacc50e-734b-4e8a-ac0c-a33197ce2351\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq" Oct 01 10:38:26 crc kubenswrapper[4735]: I1001 10:38:26.182435 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffacc50e-734b-4e8a-ac0c-a33197ce2351-ssh-key\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq\" (UID: \"ffacc50e-734b-4e8a-ac0c-a33197ce2351\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq" Oct 01 10:38:26 crc kubenswrapper[4735]: I1001 10:38:26.182685 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffacc50e-734b-4e8a-ac0c-a33197ce2351-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq\" (UID: \"ffacc50e-734b-4e8a-ac0c-a33197ce2351\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq" Oct 01 10:38:26 crc kubenswrapper[4735]: I1001 10:38:26.187178 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffacc50e-734b-4e8a-ac0c-a33197ce2351-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq\" (UID: \"ffacc50e-734b-4e8a-ac0c-a33197ce2351\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq" Oct 01 10:38:26 crc kubenswrapper[4735]: I1001 10:38:26.187957 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffacc50e-734b-4e8a-ac0c-a33197ce2351-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq\" (UID: \"ffacc50e-734b-4e8a-ac0c-a33197ce2351\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq" Oct 01 10:38:26 crc kubenswrapper[4735]: I1001 10:38:26.188964 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffacc50e-734b-4e8a-ac0c-a33197ce2351-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq\" (UID: \"ffacc50e-734b-4e8a-ac0c-a33197ce2351\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq" Oct 01 10:38:26 crc kubenswrapper[4735]: I1001 10:38:26.203487 4735 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-n7czz\" (UniqueName: \"kubernetes.io/projected/ffacc50e-734b-4e8a-ac0c-a33197ce2351-kube-api-access-n7czz\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq\" (UID: \"ffacc50e-734b-4e8a-ac0c-a33197ce2351\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq" Oct 01 10:38:26 crc kubenswrapper[4735]: I1001 10:38:26.305353 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq" Oct 01 10:38:26 crc kubenswrapper[4735]: I1001 10:38:26.861204 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq"] Oct 01 10:38:26 crc kubenswrapper[4735]: W1001 10:38:26.867101 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffacc50e_734b_4e8a_ac0c_a33197ce2351.slice/crio-e3a0a091020227fd647e3b74dd2b91499ce73124f06e149b8cfda0aa021e629d WatchSource:0}: Error finding container e3a0a091020227fd647e3b74dd2b91499ce73124f06e149b8cfda0aa021e629d: Status 404 returned error can't find the container with id e3a0a091020227fd647e3b74dd2b91499ce73124f06e149b8cfda0aa021e629d Oct 01 10:38:26 crc kubenswrapper[4735]: I1001 10:38:26.934830 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq" event={"ID":"ffacc50e-734b-4e8a-ac0c-a33197ce2351","Type":"ContainerStarted","Data":"e3a0a091020227fd647e3b74dd2b91499ce73124f06e149b8cfda0aa021e629d"} Oct 01 10:38:29 crc kubenswrapper[4735]: I1001 10:38:29.979843 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq" event={"ID":"ffacc50e-734b-4e8a-ac0c-a33197ce2351","Type":"ContainerStarted","Data":"82b533cabd58e892dd6eedf82dbf37cfb938fc59cba05506e4ef2472fc240cfa"} Oct 01 10:38:30 crc kubenswrapper[4735]: I1001 10:38:30.008361 4735 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq" podStartSLOduration=3.20970105 podStartE2EDuration="5.00833687s" podCreationTimestamp="2025-10-01 10:38:25 +0000 UTC" firstStartedPulling="2025-10-01 10:38:26.87224548 +0000 UTC m=+1265.565066752" lastFinishedPulling="2025-10-01 10:38:28.67088127 +0000 UTC m=+1267.363702572" observedRunningTime="2025-10-01 10:38:30.005541134 +0000 UTC m=+1268.698362497" watchObservedRunningTime="2025-10-01 10:38:30.00833687 +0000 UTC m=+1268.701158172" Oct 01 10:39:24 crc kubenswrapper[4735]: I1001 10:39:24.664429 4735 scope.go:117] "RemoveContainer" containerID="e42843086077517482027ea772525efff760b50a4447498be6b743665f10b9d4" Oct 01 10:39:24 crc kubenswrapper[4735]: I1001 10:39:24.692281 4735 scope.go:117] "RemoveContainer" containerID="d05439fcc255e8c49ca48f154e27cb80a2bfc88aaf5d787fc969ae1583ece2f9" Oct 01 10:39:24 crc kubenswrapper[4735]: I1001 10:39:24.739745 4735 scope.go:117] "RemoveContainer" containerID="3c26fba4b03c5d2ead52a8171b3d00ed23efc77ca18ab4fe450fd043e2e25cb4" Oct 01 10:39:24 crc kubenswrapper[4735]: I1001 10:39:24.791859 4735 scope.go:117] "RemoveContainer" containerID="28aa39512cbc18af38aa34dd5f4305a56bffa436d2d1504d64b86e11da46e0c7" Oct 01 10:39:24 crc kubenswrapper[4735]: I1001 10:39:24.816218 4735 scope.go:117] "RemoveContainer" containerID="7a05d0208176e16ca827de0751205940fe05ef3bd18c2ddf1872862f95be80ef" Oct 01 10:39:24 crc kubenswrapper[4735]: I1001 10:39:24.991847 4735 scope.go:117] "RemoveContainer" containerID="e9f5c7f24585fa8ee86eab3944a986db14a7d18437e2e38e048271bf0509211a" Oct 01 10:39:35 crc kubenswrapper[4735]: I1001 10:39:35.485735 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Oct 01 10:39:35 crc kubenswrapper[4735]: I1001 10:39:35.486312 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 10:40:04 crc kubenswrapper[4735]: I1001 10:40:04.312424 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tv8kt"] Oct 01 10:40:04 crc kubenswrapper[4735]: I1001 10:40:04.316467 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tv8kt" Oct 01 10:40:04 crc kubenswrapper[4735]: I1001 10:40:04.327947 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tv8kt"] Oct 01 10:40:04 crc kubenswrapper[4735]: I1001 10:40:04.412623 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f717549-5e93-4b10-a7d7-db97a2113233-utilities\") pod \"community-operators-tv8kt\" (UID: \"5f717549-5e93-4b10-a7d7-db97a2113233\") " pod="openshift-marketplace/community-operators-tv8kt" Oct 01 10:40:04 crc kubenswrapper[4735]: I1001 10:40:04.412884 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f717549-5e93-4b10-a7d7-db97a2113233-catalog-content\") pod \"community-operators-tv8kt\" (UID: \"5f717549-5e93-4b10-a7d7-db97a2113233\") " pod="openshift-marketplace/community-operators-tv8kt" Oct 01 10:40:04 crc kubenswrapper[4735]: I1001 10:40:04.413093 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8xkd\" (UniqueName: 
\"kubernetes.io/projected/5f717549-5e93-4b10-a7d7-db97a2113233-kube-api-access-x8xkd\") pod \"community-operators-tv8kt\" (UID: \"5f717549-5e93-4b10-a7d7-db97a2113233\") " pod="openshift-marketplace/community-operators-tv8kt" Oct 01 10:40:04 crc kubenswrapper[4735]: I1001 10:40:04.515365 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f717549-5e93-4b10-a7d7-db97a2113233-utilities\") pod \"community-operators-tv8kt\" (UID: \"5f717549-5e93-4b10-a7d7-db97a2113233\") " pod="openshift-marketplace/community-operators-tv8kt" Oct 01 10:40:04 crc kubenswrapper[4735]: I1001 10:40:04.515460 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f717549-5e93-4b10-a7d7-db97a2113233-catalog-content\") pod \"community-operators-tv8kt\" (UID: \"5f717549-5e93-4b10-a7d7-db97a2113233\") " pod="openshift-marketplace/community-operators-tv8kt" Oct 01 10:40:04 crc kubenswrapper[4735]: I1001 10:40:04.515544 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8xkd\" (UniqueName: \"kubernetes.io/projected/5f717549-5e93-4b10-a7d7-db97a2113233-kube-api-access-x8xkd\") pod \"community-operators-tv8kt\" (UID: \"5f717549-5e93-4b10-a7d7-db97a2113233\") " pod="openshift-marketplace/community-operators-tv8kt" Oct 01 10:40:04 crc kubenswrapper[4735]: I1001 10:40:04.515927 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f717549-5e93-4b10-a7d7-db97a2113233-utilities\") pod \"community-operators-tv8kt\" (UID: \"5f717549-5e93-4b10-a7d7-db97a2113233\") " pod="openshift-marketplace/community-operators-tv8kt" Oct 01 10:40:04 crc kubenswrapper[4735]: I1001 10:40:04.516012 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5f717549-5e93-4b10-a7d7-db97a2113233-catalog-content\") pod \"community-operators-tv8kt\" (UID: \"5f717549-5e93-4b10-a7d7-db97a2113233\") " pod="openshift-marketplace/community-operators-tv8kt" Oct 01 10:40:04 crc kubenswrapper[4735]: I1001 10:40:04.533790 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8xkd\" (UniqueName: \"kubernetes.io/projected/5f717549-5e93-4b10-a7d7-db97a2113233-kube-api-access-x8xkd\") pod \"community-operators-tv8kt\" (UID: \"5f717549-5e93-4b10-a7d7-db97a2113233\") " pod="openshift-marketplace/community-operators-tv8kt" Oct 01 10:40:04 crc kubenswrapper[4735]: I1001 10:40:04.650275 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tv8kt" Oct 01 10:40:05 crc kubenswrapper[4735]: I1001 10:40:05.168127 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tv8kt"] Oct 01 10:40:05 crc kubenswrapper[4735]: I1001 10:40:05.485143 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 10:40:05 crc kubenswrapper[4735]: I1001 10:40:05.485594 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 10:40:05 crc kubenswrapper[4735]: E1001 10:40:05.526343 4735 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f717549_5e93_4b10_a7d7_db97a2113233.slice/crio-conmon-0253d7031cc8b4caaea8420db599975830feae84e1c0d5fb5ef1fcdb2b8f1e9e.scope\": RecentStats: unable to find data in memory cache]" Oct 01 10:40:06 crc kubenswrapper[4735]: I1001 10:40:06.047537 4735 generic.go:334] "Generic (PLEG): container finished" podID="5f717549-5e93-4b10-a7d7-db97a2113233" containerID="0253d7031cc8b4caaea8420db599975830feae84e1c0d5fb5ef1fcdb2b8f1e9e" exitCode=0 Oct 01 10:40:06 crc kubenswrapper[4735]: I1001 10:40:06.047661 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tv8kt" event={"ID":"5f717549-5e93-4b10-a7d7-db97a2113233","Type":"ContainerDied","Data":"0253d7031cc8b4caaea8420db599975830feae84e1c0d5fb5ef1fcdb2b8f1e9e"} Oct 01 10:40:06 crc kubenswrapper[4735]: I1001 10:40:06.047727 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tv8kt" event={"ID":"5f717549-5e93-4b10-a7d7-db97a2113233","Type":"ContainerStarted","Data":"1b2836291c0b3cb11636c1b6b4da8ac24d80c02deb5b73f5350008ce082b9761"} Oct 01 10:40:07 crc kubenswrapper[4735]: I1001 10:40:07.059008 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tv8kt" event={"ID":"5f717549-5e93-4b10-a7d7-db97a2113233","Type":"ContainerStarted","Data":"d23ef77a3ca43fa802e07d02a2faba5f89277d0331d729ef2a60b115edf88d8a"} Oct 01 10:40:08 crc kubenswrapper[4735]: I1001 10:40:08.073827 4735 generic.go:334] "Generic (PLEG): container finished" podID="5f717549-5e93-4b10-a7d7-db97a2113233" containerID="d23ef77a3ca43fa802e07d02a2faba5f89277d0331d729ef2a60b115edf88d8a" exitCode=0 Oct 01 10:40:08 crc kubenswrapper[4735]: I1001 10:40:08.074232 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tv8kt" 
event={"ID":"5f717549-5e93-4b10-a7d7-db97a2113233","Type":"ContainerDied","Data":"d23ef77a3ca43fa802e07d02a2faba5f89277d0331d729ef2a60b115edf88d8a"} Oct 01 10:40:09 crc kubenswrapper[4735]: I1001 10:40:09.089695 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tv8kt" event={"ID":"5f717549-5e93-4b10-a7d7-db97a2113233","Type":"ContainerStarted","Data":"695feb64087c1f0d116a2402cf5a052d2165bfd95632407fd393b3c282865c4a"} Oct 01 10:40:10 crc kubenswrapper[4735]: I1001 10:40:10.128075 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tv8kt" podStartSLOduration=3.416652375 podStartE2EDuration="6.128054726s" podCreationTimestamp="2025-10-01 10:40:04 +0000 UTC" firstStartedPulling="2025-10-01 10:40:06.050133541 +0000 UTC m=+1364.742954843" lastFinishedPulling="2025-10-01 10:40:08.761535892 +0000 UTC m=+1367.454357194" observedRunningTime="2025-10-01 10:40:10.121992718 +0000 UTC m=+1368.814814010" watchObservedRunningTime="2025-10-01 10:40:10.128054726 +0000 UTC m=+1368.820875998" Oct 01 10:40:14 crc kubenswrapper[4735]: I1001 10:40:14.650955 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tv8kt" Oct 01 10:40:14 crc kubenswrapper[4735]: I1001 10:40:14.651673 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tv8kt" Oct 01 10:40:14 crc kubenswrapper[4735]: I1001 10:40:14.723679 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tv8kt" Oct 01 10:40:15 crc kubenswrapper[4735]: I1001 10:40:15.224300 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tv8kt" Oct 01 10:40:15 crc kubenswrapper[4735]: I1001 10:40:15.301866 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-tv8kt"] Oct 01 10:40:17 crc kubenswrapper[4735]: I1001 10:40:17.178485 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tv8kt" podUID="5f717549-5e93-4b10-a7d7-db97a2113233" containerName="registry-server" containerID="cri-o://695feb64087c1f0d116a2402cf5a052d2165bfd95632407fd393b3c282865c4a" gracePeriod=2 Oct 01 10:40:17 crc kubenswrapper[4735]: I1001 10:40:17.681580 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tv8kt" Oct 01 10:40:17 crc kubenswrapper[4735]: I1001 10:40:17.796289 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8xkd\" (UniqueName: \"kubernetes.io/projected/5f717549-5e93-4b10-a7d7-db97a2113233-kube-api-access-x8xkd\") pod \"5f717549-5e93-4b10-a7d7-db97a2113233\" (UID: \"5f717549-5e93-4b10-a7d7-db97a2113233\") " Oct 01 10:40:17 crc kubenswrapper[4735]: I1001 10:40:17.796580 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f717549-5e93-4b10-a7d7-db97a2113233-catalog-content\") pod \"5f717549-5e93-4b10-a7d7-db97a2113233\" (UID: \"5f717549-5e93-4b10-a7d7-db97a2113233\") " Oct 01 10:40:17 crc kubenswrapper[4735]: I1001 10:40:17.796683 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f717549-5e93-4b10-a7d7-db97a2113233-utilities\") pod \"5f717549-5e93-4b10-a7d7-db97a2113233\" (UID: \"5f717549-5e93-4b10-a7d7-db97a2113233\") " Oct 01 10:40:17 crc kubenswrapper[4735]: I1001 10:40:17.798133 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f717549-5e93-4b10-a7d7-db97a2113233-utilities" (OuterVolumeSpecName: "utilities") pod "5f717549-5e93-4b10-a7d7-db97a2113233" (UID: 
"5f717549-5e93-4b10-a7d7-db97a2113233"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:40:17 crc kubenswrapper[4735]: I1001 10:40:17.799125 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f717549-5e93-4b10-a7d7-db97a2113233-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 10:40:17 crc kubenswrapper[4735]: I1001 10:40:17.803220 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f717549-5e93-4b10-a7d7-db97a2113233-kube-api-access-x8xkd" (OuterVolumeSpecName: "kube-api-access-x8xkd") pod "5f717549-5e93-4b10-a7d7-db97a2113233" (UID: "5f717549-5e93-4b10-a7d7-db97a2113233"). InnerVolumeSpecName "kube-api-access-x8xkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:40:17 crc kubenswrapper[4735]: I1001 10:40:17.900622 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8xkd\" (UniqueName: \"kubernetes.io/projected/5f717549-5e93-4b10-a7d7-db97a2113233-kube-api-access-x8xkd\") on node \"crc\" DevicePath \"\"" Oct 01 10:40:18 crc kubenswrapper[4735]: I1001 10:40:18.195735 4735 generic.go:334] "Generic (PLEG): container finished" podID="5f717549-5e93-4b10-a7d7-db97a2113233" containerID="695feb64087c1f0d116a2402cf5a052d2165bfd95632407fd393b3c282865c4a" exitCode=0 Oct 01 10:40:18 crc kubenswrapper[4735]: I1001 10:40:18.195778 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tv8kt" event={"ID":"5f717549-5e93-4b10-a7d7-db97a2113233","Type":"ContainerDied","Data":"695feb64087c1f0d116a2402cf5a052d2165bfd95632407fd393b3c282865c4a"} Oct 01 10:40:18 crc kubenswrapper[4735]: I1001 10:40:18.195809 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tv8kt" 
event={"ID":"5f717549-5e93-4b10-a7d7-db97a2113233","Type":"ContainerDied","Data":"1b2836291c0b3cb11636c1b6b4da8ac24d80c02deb5b73f5350008ce082b9761"} Oct 01 10:40:18 crc kubenswrapper[4735]: I1001 10:40:18.195825 4735 scope.go:117] "RemoveContainer" containerID="695feb64087c1f0d116a2402cf5a052d2165bfd95632407fd393b3c282865c4a" Oct 01 10:40:18 crc kubenswrapper[4735]: I1001 10:40:18.195857 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tv8kt" Oct 01 10:40:18 crc kubenswrapper[4735]: I1001 10:40:18.221114 4735 scope.go:117] "RemoveContainer" containerID="d23ef77a3ca43fa802e07d02a2faba5f89277d0331d729ef2a60b115edf88d8a" Oct 01 10:40:18 crc kubenswrapper[4735]: I1001 10:40:18.248592 4735 scope.go:117] "RemoveContainer" containerID="0253d7031cc8b4caaea8420db599975830feae84e1c0d5fb5ef1fcdb2b8f1e9e" Oct 01 10:40:18 crc kubenswrapper[4735]: I1001 10:40:18.326162 4735 scope.go:117] "RemoveContainer" containerID="695feb64087c1f0d116a2402cf5a052d2165bfd95632407fd393b3c282865c4a" Oct 01 10:40:18 crc kubenswrapper[4735]: E1001 10:40:18.327022 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"695feb64087c1f0d116a2402cf5a052d2165bfd95632407fd393b3c282865c4a\": container with ID starting with 695feb64087c1f0d116a2402cf5a052d2165bfd95632407fd393b3c282865c4a not found: ID does not exist" containerID="695feb64087c1f0d116a2402cf5a052d2165bfd95632407fd393b3c282865c4a" Oct 01 10:40:18 crc kubenswrapper[4735]: I1001 10:40:18.327143 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"695feb64087c1f0d116a2402cf5a052d2165bfd95632407fd393b3c282865c4a"} err="failed to get container status \"695feb64087c1f0d116a2402cf5a052d2165bfd95632407fd393b3c282865c4a\": rpc error: code = NotFound desc = could not find container \"695feb64087c1f0d116a2402cf5a052d2165bfd95632407fd393b3c282865c4a\": 
container with ID starting with 695feb64087c1f0d116a2402cf5a052d2165bfd95632407fd393b3c282865c4a not found: ID does not exist" Oct 01 10:40:18 crc kubenswrapper[4735]: I1001 10:40:18.327265 4735 scope.go:117] "RemoveContainer" containerID="d23ef77a3ca43fa802e07d02a2faba5f89277d0331d729ef2a60b115edf88d8a" Oct 01 10:40:18 crc kubenswrapper[4735]: E1001 10:40:18.327685 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d23ef77a3ca43fa802e07d02a2faba5f89277d0331d729ef2a60b115edf88d8a\": container with ID starting with d23ef77a3ca43fa802e07d02a2faba5f89277d0331d729ef2a60b115edf88d8a not found: ID does not exist" containerID="d23ef77a3ca43fa802e07d02a2faba5f89277d0331d729ef2a60b115edf88d8a" Oct 01 10:40:18 crc kubenswrapper[4735]: I1001 10:40:18.327715 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d23ef77a3ca43fa802e07d02a2faba5f89277d0331d729ef2a60b115edf88d8a"} err="failed to get container status \"d23ef77a3ca43fa802e07d02a2faba5f89277d0331d729ef2a60b115edf88d8a\": rpc error: code = NotFound desc = could not find container \"d23ef77a3ca43fa802e07d02a2faba5f89277d0331d729ef2a60b115edf88d8a\": container with ID starting with d23ef77a3ca43fa802e07d02a2faba5f89277d0331d729ef2a60b115edf88d8a not found: ID does not exist" Oct 01 10:40:18 crc kubenswrapper[4735]: I1001 10:40:18.327735 4735 scope.go:117] "RemoveContainer" containerID="0253d7031cc8b4caaea8420db599975830feae84e1c0d5fb5ef1fcdb2b8f1e9e" Oct 01 10:40:18 crc kubenswrapper[4735]: E1001 10:40:18.328158 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0253d7031cc8b4caaea8420db599975830feae84e1c0d5fb5ef1fcdb2b8f1e9e\": container with ID starting with 0253d7031cc8b4caaea8420db599975830feae84e1c0d5fb5ef1fcdb2b8f1e9e not found: ID does not exist" 
containerID="0253d7031cc8b4caaea8420db599975830feae84e1c0d5fb5ef1fcdb2b8f1e9e" Oct 01 10:40:18 crc kubenswrapper[4735]: I1001 10:40:18.328277 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0253d7031cc8b4caaea8420db599975830feae84e1c0d5fb5ef1fcdb2b8f1e9e"} err="failed to get container status \"0253d7031cc8b4caaea8420db599975830feae84e1c0d5fb5ef1fcdb2b8f1e9e\": rpc error: code = NotFound desc = could not find container \"0253d7031cc8b4caaea8420db599975830feae84e1c0d5fb5ef1fcdb2b8f1e9e\": container with ID starting with 0253d7031cc8b4caaea8420db599975830feae84e1c0d5fb5ef1fcdb2b8f1e9e not found: ID does not exist" Oct 01 10:40:18 crc kubenswrapper[4735]: I1001 10:40:18.529896 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f717549-5e93-4b10-a7d7-db97a2113233-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5f717549-5e93-4b10-a7d7-db97a2113233" (UID: "5f717549-5e93-4b10-a7d7-db97a2113233"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:40:18 crc kubenswrapper[4735]: I1001 10:40:18.614923 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f717549-5e93-4b10-a7d7-db97a2113233-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 10:40:18 crc kubenswrapper[4735]: I1001 10:40:18.825283 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tv8kt"] Oct 01 10:40:18 crc kubenswrapper[4735]: I1001 10:40:18.833878 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tv8kt"] Oct 01 10:40:19 crc kubenswrapper[4735]: I1001 10:40:19.911717 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f717549-5e93-4b10-a7d7-db97a2113233" path="/var/lib/kubelet/pods/5f717549-5e93-4b10-a7d7-db97a2113233/volumes" Oct 01 10:40:20 crc kubenswrapper[4735]: I1001 10:40:20.382219 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t6swg"] Oct 01 10:40:20 crc kubenswrapper[4735]: E1001 10:40:20.384373 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f717549-5e93-4b10-a7d7-db97a2113233" containerName="extract-utilities" Oct 01 10:40:20 crc kubenswrapper[4735]: I1001 10:40:20.384553 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f717549-5e93-4b10-a7d7-db97a2113233" containerName="extract-utilities" Oct 01 10:40:20 crc kubenswrapper[4735]: E1001 10:40:20.384778 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f717549-5e93-4b10-a7d7-db97a2113233" containerName="extract-content" Oct 01 10:40:20 crc kubenswrapper[4735]: I1001 10:40:20.384922 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f717549-5e93-4b10-a7d7-db97a2113233" containerName="extract-content" Oct 01 10:40:20 crc kubenswrapper[4735]: E1001 10:40:20.385067 4735 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="5f717549-5e93-4b10-a7d7-db97a2113233" containerName="registry-server" Oct 01 10:40:20 crc kubenswrapper[4735]: I1001 10:40:20.385206 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f717549-5e93-4b10-a7d7-db97a2113233" containerName="registry-server" Oct 01 10:40:20 crc kubenswrapper[4735]: I1001 10:40:20.385864 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f717549-5e93-4b10-a7d7-db97a2113233" containerName="registry-server" Oct 01 10:40:20 crc kubenswrapper[4735]: I1001 10:40:20.389559 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t6swg" Oct 01 10:40:20 crc kubenswrapper[4735]: I1001 10:40:20.398359 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t6swg"] Oct 01 10:40:20 crc kubenswrapper[4735]: I1001 10:40:20.449613 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf-catalog-content\") pod \"certified-operators-t6swg\" (UID: \"1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf\") " pod="openshift-marketplace/certified-operators-t6swg" Oct 01 10:40:20 crc kubenswrapper[4735]: I1001 10:40:20.449853 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwb7g\" (UniqueName: \"kubernetes.io/projected/1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf-kube-api-access-dwb7g\") pod \"certified-operators-t6swg\" (UID: \"1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf\") " pod="openshift-marketplace/certified-operators-t6swg" Oct 01 10:40:20 crc kubenswrapper[4735]: I1001 10:40:20.449973 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf-utilities\") pod 
\"certified-operators-t6swg\" (UID: \"1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf\") " pod="openshift-marketplace/certified-operators-t6swg" Oct 01 10:40:20 crc kubenswrapper[4735]: I1001 10:40:20.552394 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf-catalog-content\") pod \"certified-operators-t6swg\" (UID: \"1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf\") " pod="openshift-marketplace/certified-operators-t6swg" Oct 01 10:40:20 crc kubenswrapper[4735]: I1001 10:40:20.552682 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwb7g\" (UniqueName: \"kubernetes.io/projected/1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf-kube-api-access-dwb7g\") pod \"certified-operators-t6swg\" (UID: \"1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf\") " pod="openshift-marketplace/certified-operators-t6swg" Oct 01 10:40:20 crc kubenswrapper[4735]: I1001 10:40:20.552839 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf-utilities\") pod \"certified-operators-t6swg\" (UID: \"1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf\") " pod="openshift-marketplace/certified-operators-t6swg" Oct 01 10:40:20 crc kubenswrapper[4735]: I1001 10:40:20.553111 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf-catalog-content\") pod \"certified-operators-t6swg\" (UID: \"1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf\") " pod="openshift-marketplace/certified-operators-t6swg" Oct 01 10:40:20 crc kubenswrapper[4735]: I1001 10:40:20.553725 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf-utilities\") pod \"certified-operators-t6swg\" (UID: 
\"1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf\") " pod="openshift-marketplace/certified-operators-t6swg" Oct 01 10:40:20 crc kubenswrapper[4735]: I1001 10:40:20.588559 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwb7g\" (UniqueName: \"kubernetes.io/projected/1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf-kube-api-access-dwb7g\") pod \"certified-operators-t6swg\" (UID: \"1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf\") " pod="openshift-marketplace/certified-operators-t6swg" Oct 01 10:40:20 crc kubenswrapper[4735]: I1001 10:40:20.716537 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t6swg" Oct 01 10:40:21 crc kubenswrapper[4735]: I1001 10:40:21.190279 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t6swg"] Oct 01 10:40:21 crc kubenswrapper[4735]: I1001 10:40:21.234288 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6swg" event={"ID":"1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf","Type":"ContainerStarted","Data":"2805c7ddda71ef13b59c33e0f3ef3e67ce4fad7be191e58e86b65f4fd1654dfe"} Oct 01 10:40:22 crc kubenswrapper[4735]: I1001 10:40:22.246955 4735 generic.go:334] "Generic (PLEG): container finished" podID="1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf" containerID="c2a6ae124754076fcf9039c2ccad705d4a025fb428c5ea30e54533d8b9b33541" exitCode=0 Oct 01 10:40:22 crc kubenswrapper[4735]: I1001 10:40:22.247066 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6swg" event={"ID":"1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf","Type":"ContainerDied","Data":"c2a6ae124754076fcf9039c2ccad705d4a025fb428c5ea30e54533d8b9b33541"} Oct 01 10:40:24 crc kubenswrapper[4735]: I1001 10:40:24.270898 4735 generic.go:334] "Generic (PLEG): container finished" podID="1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf" 
containerID="13d7b7f7b1a0c2d1ab3e9c3466d2b945a9b33cdb8af1c4933a97904781b49748" exitCode=0 Oct 01 10:40:24 crc kubenswrapper[4735]: I1001 10:40:24.270996 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6swg" event={"ID":"1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf","Type":"ContainerDied","Data":"13d7b7f7b1a0c2d1ab3e9c3466d2b945a9b33cdb8af1c4933a97904781b49748"} Oct 01 10:40:25 crc kubenswrapper[4735]: I1001 10:40:25.283828 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6swg" event={"ID":"1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf","Type":"ContainerStarted","Data":"d568b2096945c2c6714dfb26f14e522e5ab1d283556eb94600735d8ba05e0892"} Oct 01 10:40:25 crc kubenswrapper[4735]: I1001 10:40:25.305778 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t6swg" podStartSLOduration=2.801201454 podStartE2EDuration="5.305755943s" podCreationTimestamp="2025-10-01 10:40:20 +0000 UTC" firstStartedPulling="2025-10-01 10:40:22.248687994 +0000 UTC m=+1380.941509256" lastFinishedPulling="2025-10-01 10:40:24.753242473 +0000 UTC m=+1383.446063745" observedRunningTime="2025-10-01 10:40:25.302383055 +0000 UTC m=+1383.995204317" watchObservedRunningTime="2025-10-01 10:40:25.305755943 +0000 UTC m=+1383.998577205" Oct 01 10:40:30 crc kubenswrapper[4735]: I1001 10:40:30.717524 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t6swg" Oct 01 10:40:30 crc kubenswrapper[4735]: I1001 10:40:30.717995 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t6swg" Oct 01 10:40:30 crc kubenswrapper[4735]: I1001 10:40:30.787169 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t6swg" Oct 01 10:40:31 crc kubenswrapper[4735]: I1001 
10:40:31.395481 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t6swg" Oct 01 10:40:31 crc kubenswrapper[4735]: I1001 10:40:31.446135 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t6swg"] Oct 01 10:40:33 crc kubenswrapper[4735]: I1001 10:40:33.362904 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t6swg" podUID="1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf" containerName="registry-server" containerID="cri-o://d568b2096945c2c6714dfb26f14e522e5ab1d283556eb94600735d8ba05e0892" gracePeriod=2 Oct 01 10:40:33 crc kubenswrapper[4735]: I1001 10:40:33.817208 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t6swg" Oct 01 10:40:33 crc kubenswrapper[4735]: I1001 10:40:33.907804 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf-catalog-content\") pod \"1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf\" (UID: \"1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf\") " Oct 01 10:40:33 crc kubenswrapper[4735]: I1001 10:40:33.907891 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf-utilities\") pod \"1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf\" (UID: \"1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf\") " Oct 01 10:40:33 crc kubenswrapper[4735]: I1001 10:40:33.908169 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwb7g\" (UniqueName: \"kubernetes.io/projected/1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf-kube-api-access-dwb7g\") pod \"1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf\" (UID: \"1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf\") " Oct 01 10:40:33 crc kubenswrapper[4735]: 
I1001 10:40:33.909524 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf-utilities" (OuterVolumeSpecName: "utilities") pod "1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf" (UID: "1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:40:33 crc kubenswrapper[4735]: I1001 10:40:33.915185 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf-kube-api-access-dwb7g" (OuterVolumeSpecName: "kube-api-access-dwb7g") pod "1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf" (UID: "1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf"). InnerVolumeSpecName "kube-api-access-dwb7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:40:33 crc kubenswrapper[4735]: I1001 10:40:33.969318 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf" (UID: "1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:40:34 crc kubenswrapper[4735]: I1001 10:40:34.011603 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwb7g\" (UniqueName: \"kubernetes.io/projected/1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf-kube-api-access-dwb7g\") on node \"crc\" DevicePath \"\"" Oct 01 10:40:34 crc kubenswrapper[4735]: I1001 10:40:34.012005 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 10:40:34 crc kubenswrapper[4735]: I1001 10:40:34.012020 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 10:40:34 crc kubenswrapper[4735]: I1001 10:40:34.377943 4735 generic.go:334] "Generic (PLEG): container finished" podID="1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf" containerID="d568b2096945c2c6714dfb26f14e522e5ab1d283556eb94600735d8ba05e0892" exitCode=0 Oct 01 10:40:34 crc kubenswrapper[4735]: I1001 10:40:34.378004 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t6swg" Oct 01 10:40:34 crc kubenswrapper[4735]: I1001 10:40:34.378006 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6swg" event={"ID":"1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf","Type":"ContainerDied","Data":"d568b2096945c2c6714dfb26f14e522e5ab1d283556eb94600735d8ba05e0892"} Oct 01 10:40:34 crc kubenswrapper[4735]: I1001 10:40:34.378079 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6swg" event={"ID":"1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf","Type":"ContainerDied","Data":"2805c7ddda71ef13b59c33e0f3ef3e67ce4fad7be191e58e86b65f4fd1654dfe"} Oct 01 10:40:34 crc kubenswrapper[4735]: I1001 10:40:34.378109 4735 scope.go:117] "RemoveContainer" containerID="d568b2096945c2c6714dfb26f14e522e5ab1d283556eb94600735d8ba05e0892" Oct 01 10:40:34 crc kubenswrapper[4735]: I1001 10:40:34.421717 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t6swg"] Oct 01 10:40:34 crc kubenswrapper[4735]: I1001 10:40:34.427974 4735 scope.go:117] "RemoveContainer" containerID="13d7b7f7b1a0c2d1ab3e9c3466d2b945a9b33cdb8af1c4933a97904781b49748" Oct 01 10:40:34 crc kubenswrapper[4735]: I1001 10:40:34.431800 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t6swg"] Oct 01 10:40:34 crc kubenswrapper[4735]: I1001 10:40:34.452916 4735 scope.go:117] "RemoveContainer" containerID="c2a6ae124754076fcf9039c2ccad705d4a025fb428c5ea30e54533d8b9b33541" Oct 01 10:40:34 crc kubenswrapper[4735]: I1001 10:40:34.508810 4735 scope.go:117] "RemoveContainer" containerID="d568b2096945c2c6714dfb26f14e522e5ab1d283556eb94600735d8ba05e0892" Oct 01 10:40:34 crc kubenswrapper[4735]: E1001 10:40:34.509563 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d568b2096945c2c6714dfb26f14e522e5ab1d283556eb94600735d8ba05e0892\": container with ID starting with d568b2096945c2c6714dfb26f14e522e5ab1d283556eb94600735d8ba05e0892 not found: ID does not exist" containerID="d568b2096945c2c6714dfb26f14e522e5ab1d283556eb94600735d8ba05e0892" Oct 01 10:40:34 crc kubenswrapper[4735]: I1001 10:40:34.509603 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d568b2096945c2c6714dfb26f14e522e5ab1d283556eb94600735d8ba05e0892"} err="failed to get container status \"d568b2096945c2c6714dfb26f14e522e5ab1d283556eb94600735d8ba05e0892\": rpc error: code = NotFound desc = could not find container \"d568b2096945c2c6714dfb26f14e522e5ab1d283556eb94600735d8ba05e0892\": container with ID starting with d568b2096945c2c6714dfb26f14e522e5ab1d283556eb94600735d8ba05e0892 not found: ID does not exist" Oct 01 10:40:34 crc kubenswrapper[4735]: I1001 10:40:34.509636 4735 scope.go:117] "RemoveContainer" containerID="13d7b7f7b1a0c2d1ab3e9c3466d2b945a9b33cdb8af1c4933a97904781b49748" Oct 01 10:40:34 crc kubenswrapper[4735]: E1001 10:40:34.510164 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13d7b7f7b1a0c2d1ab3e9c3466d2b945a9b33cdb8af1c4933a97904781b49748\": container with ID starting with 13d7b7f7b1a0c2d1ab3e9c3466d2b945a9b33cdb8af1c4933a97904781b49748 not found: ID does not exist" containerID="13d7b7f7b1a0c2d1ab3e9c3466d2b945a9b33cdb8af1c4933a97904781b49748" Oct 01 10:40:34 crc kubenswrapper[4735]: I1001 10:40:34.510186 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13d7b7f7b1a0c2d1ab3e9c3466d2b945a9b33cdb8af1c4933a97904781b49748"} err="failed to get container status \"13d7b7f7b1a0c2d1ab3e9c3466d2b945a9b33cdb8af1c4933a97904781b49748\": rpc error: code = NotFound desc = could not find container \"13d7b7f7b1a0c2d1ab3e9c3466d2b945a9b33cdb8af1c4933a97904781b49748\": container with ID 
starting with 13d7b7f7b1a0c2d1ab3e9c3466d2b945a9b33cdb8af1c4933a97904781b49748 not found: ID does not exist" Oct 01 10:40:34 crc kubenswrapper[4735]: I1001 10:40:34.510202 4735 scope.go:117] "RemoveContainer" containerID="c2a6ae124754076fcf9039c2ccad705d4a025fb428c5ea30e54533d8b9b33541" Oct 01 10:40:34 crc kubenswrapper[4735]: E1001 10:40:34.510565 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2a6ae124754076fcf9039c2ccad705d4a025fb428c5ea30e54533d8b9b33541\": container with ID starting with c2a6ae124754076fcf9039c2ccad705d4a025fb428c5ea30e54533d8b9b33541 not found: ID does not exist" containerID="c2a6ae124754076fcf9039c2ccad705d4a025fb428c5ea30e54533d8b9b33541" Oct 01 10:40:34 crc kubenswrapper[4735]: I1001 10:40:34.510590 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2a6ae124754076fcf9039c2ccad705d4a025fb428c5ea30e54533d8b9b33541"} err="failed to get container status \"c2a6ae124754076fcf9039c2ccad705d4a025fb428c5ea30e54533d8b9b33541\": rpc error: code = NotFound desc = could not find container \"c2a6ae124754076fcf9039c2ccad705d4a025fb428c5ea30e54533d8b9b33541\": container with ID starting with c2a6ae124754076fcf9039c2ccad705d4a025fb428c5ea30e54533d8b9b33541 not found: ID does not exist" Oct 01 10:40:35 crc kubenswrapper[4735]: I1001 10:40:35.486530 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 10:40:35 crc kubenswrapper[4735]: I1001 10:40:35.486646 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 10:40:35 crc kubenswrapper[4735]: I1001 10:40:35.486725 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" Oct 01 10:40:35 crc kubenswrapper[4735]: I1001 10:40:35.487995 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"32fd47d6848f87a0586764661e897b70eae54a447c3515a2ff59f02148ba9b6a"} pod="openshift-machine-config-operator/machine-config-daemon-xgg24" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 10:40:35 crc kubenswrapper[4735]: I1001 10:40:35.488103 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" containerID="cri-o://32fd47d6848f87a0586764661e897b70eae54a447c3515a2ff59f02148ba9b6a" gracePeriod=600 Oct 01 10:40:35 crc kubenswrapper[4735]: I1001 10:40:35.910684 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf" path="/var/lib/kubelet/pods/1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf/volumes" Oct 01 10:40:36 crc kubenswrapper[4735]: I1001 10:40:36.415064 4735 generic.go:334] "Generic (PLEG): container finished" podID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerID="32fd47d6848f87a0586764661e897b70eae54a447c3515a2ff59f02148ba9b6a" exitCode=0 Oct 01 10:40:36 crc kubenswrapper[4735]: I1001 10:40:36.415108 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" event={"ID":"8c2fdbf0-2469-4ca0-8624-d63609123cd1","Type":"ContainerDied","Data":"32fd47d6848f87a0586764661e897b70eae54a447c3515a2ff59f02148ba9b6a"} Oct 01 10:40:36 crc 
kubenswrapper[4735]: I1001 10:40:36.415909 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" event={"ID":"8c2fdbf0-2469-4ca0-8624-d63609123cd1","Type":"ContainerStarted","Data":"6bb0238a5016780708fe4e2202a4b58f82d3046437841d7c2fad3739fa55b3ba"} Oct 01 10:40:36 crc kubenswrapper[4735]: I1001 10:40:36.415940 4735 scope.go:117] "RemoveContainer" containerID="6042cbc8542ef3e83bd7d2006832c7a2d565a12a6c7538fc83d415163087f591" Oct 01 10:40:39 crc kubenswrapper[4735]: I1001 10:40:39.519975 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gftbk"] Oct 01 10:40:39 crc kubenswrapper[4735]: E1001 10:40:39.521295 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf" containerName="extract-content" Oct 01 10:40:39 crc kubenswrapper[4735]: I1001 10:40:39.521320 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf" containerName="extract-content" Oct 01 10:40:39 crc kubenswrapper[4735]: E1001 10:40:39.521341 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf" containerName="extract-utilities" Oct 01 10:40:39 crc kubenswrapper[4735]: I1001 10:40:39.521354 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf" containerName="extract-utilities" Oct 01 10:40:39 crc kubenswrapper[4735]: E1001 10:40:39.521406 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf" containerName="registry-server" Oct 01 10:40:39 crc kubenswrapper[4735]: I1001 10:40:39.521420 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf" containerName="registry-server" Oct 01 10:40:39 crc kubenswrapper[4735]: I1001 10:40:39.521775 4735 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1d88c56d-d1eb-45f1-acfa-8dd05f1e93cf" containerName="registry-server" Oct 01 10:40:39 crc kubenswrapper[4735]: I1001 10:40:39.549732 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gftbk"] Oct 01 10:40:39 crc kubenswrapper[4735]: I1001 10:40:39.549908 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gftbk" Oct 01 10:40:39 crc kubenswrapper[4735]: I1001 10:40:39.646547 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7e424b8-6b44-498e-9f6b-064a4c857cfb-catalog-content\") pod \"redhat-marketplace-gftbk\" (UID: \"e7e424b8-6b44-498e-9f6b-064a4c857cfb\") " pod="openshift-marketplace/redhat-marketplace-gftbk" Oct 01 10:40:39 crc kubenswrapper[4735]: I1001 10:40:39.647576 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp72x\" (UniqueName: \"kubernetes.io/projected/e7e424b8-6b44-498e-9f6b-064a4c857cfb-kube-api-access-dp72x\") pod \"redhat-marketplace-gftbk\" (UID: \"e7e424b8-6b44-498e-9f6b-064a4c857cfb\") " pod="openshift-marketplace/redhat-marketplace-gftbk" Oct 01 10:40:39 crc kubenswrapper[4735]: I1001 10:40:39.647674 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7e424b8-6b44-498e-9f6b-064a4c857cfb-utilities\") pod \"redhat-marketplace-gftbk\" (UID: \"e7e424b8-6b44-498e-9f6b-064a4c857cfb\") " pod="openshift-marketplace/redhat-marketplace-gftbk" Oct 01 10:40:39 crc kubenswrapper[4735]: I1001 10:40:39.749865 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp72x\" (UniqueName: \"kubernetes.io/projected/e7e424b8-6b44-498e-9f6b-064a4c857cfb-kube-api-access-dp72x\") pod \"redhat-marketplace-gftbk\" (UID: 
\"e7e424b8-6b44-498e-9f6b-064a4c857cfb\") " pod="openshift-marketplace/redhat-marketplace-gftbk" Oct 01 10:40:39 crc kubenswrapper[4735]: I1001 10:40:39.749914 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7e424b8-6b44-498e-9f6b-064a4c857cfb-utilities\") pod \"redhat-marketplace-gftbk\" (UID: \"e7e424b8-6b44-498e-9f6b-064a4c857cfb\") " pod="openshift-marketplace/redhat-marketplace-gftbk" Oct 01 10:40:39 crc kubenswrapper[4735]: I1001 10:40:39.749974 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7e424b8-6b44-498e-9f6b-064a4c857cfb-catalog-content\") pod \"redhat-marketplace-gftbk\" (UID: \"e7e424b8-6b44-498e-9f6b-064a4c857cfb\") " pod="openshift-marketplace/redhat-marketplace-gftbk" Oct 01 10:40:39 crc kubenswrapper[4735]: I1001 10:40:39.750447 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7e424b8-6b44-498e-9f6b-064a4c857cfb-catalog-content\") pod \"redhat-marketplace-gftbk\" (UID: \"e7e424b8-6b44-498e-9f6b-064a4c857cfb\") " pod="openshift-marketplace/redhat-marketplace-gftbk" Oct 01 10:40:39 crc kubenswrapper[4735]: I1001 10:40:39.750560 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7e424b8-6b44-498e-9f6b-064a4c857cfb-utilities\") pod \"redhat-marketplace-gftbk\" (UID: \"e7e424b8-6b44-498e-9f6b-064a4c857cfb\") " pod="openshift-marketplace/redhat-marketplace-gftbk" Oct 01 10:40:39 crc kubenswrapper[4735]: I1001 10:40:39.775571 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp72x\" (UniqueName: \"kubernetes.io/projected/e7e424b8-6b44-498e-9f6b-064a4c857cfb-kube-api-access-dp72x\") pod \"redhat-marketplace-gftbk\" (UID: \"e7e424b8-6b44-498e-9f6b-064a4c857cfb\") " 
pod="openshift-marketplace/redhat-marketplace-gftbk" Oct 01 10:40:39 crc kubenswrapper[4735]: I1001 10:40:39.884638 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gftbk" Oct 01 10:40:40 crc kubenswrapper[4735]: I1001 10:40:40.374704 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gftbk"] Oct 01 10:40:40 crc kubenswrapper[4735]: W1001 10:40:40.377626 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7e424b8_6b44_498e_9f6b_064a4c857cfb.slice/crio-8c6bb5fc458119e99e23a330968dcc09b54b3f8ea9f160225b7101f9f99a15e0 WatchSource:0}: Error finding container 8c6bb5fc458119e99e23a330968dcc09b54b3f8ea9f160225b7101f9f99a15e0: Status 404 returned error can't find the container with id 8c6bb5fc458119e99e23a330968dcc09b54b3f8ea9f160225b7101f9f99a15e0 Oct 01 10:40:40 crc kubenswrapper[4735]: I1001 10:40:40.464569 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gftbk" event={"ID":"e7e424b8-6b44-498e-9f6b-064a4c857cfb","Type":"ContainerStarted","Data":"8c6bb5fc458119e99e23a330968dcc09b54b3f8ea9f160225b7101f9f99a15e0"} Oct 01 10:40:41 crc kubenswrapper[4735]: I1001 10:40:41.475460 4735 generic.go:334] "Generic (PLEG): container finished" podID="e7e424b8-6b44-498e-9f6b-064a4c857cfb" containerID="515e0fc12bdc66240cfac33d8bac323ac993eb6f1e9e28ccc6a5783840f7d0e4" exitCode=0 Oct 01 10:40:41 crc kubenswrapper[4735]: I1001 10:40:41.475533 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gftbk" event={"ID":"e7e424b8-6b44-498e-9f6b-064a4c857cfb","Type":"ContainerDied","Data":"515e0fc12bdc66240cfac33d8bac323ac993eb6f1e9e28ccc6a5783840f7d0e4"} Oct 01 10:40:42 crc kubenswrapper[4735]: I1001 10:40:42.491232 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-gftbk" event={"ID":"e7e424b8-6b44-498e-9f6b-064a4c857cfb","Type":"ContainerDied","Data":"6ef457741eef72e00719b789778f617073c3f12e78dc047369d8563e60ae3b28"} Oct 01 10:40:42 crc kubenswrapper[4735]: I1001 10:40:42.491047 4735 generic.go:334] "Generic (PLEG): container finished" podID="e7e424b8-6b44-498e-9f6b-064a4c857cfb" containerID="6ef457741eef72e00719b789778f617073c3f12e78dc047369d8563e60ae3b28" exitCode=0 Oct 01 10:40:43 crc kubenswrapper[4735]: I1001 10:40:43.506528 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gftbk" event={"ID":"e7e424b8-6b44-498e-9f6b-064a4c857cfb","Type":"ContainerStarted","Data":"24cd1773dcaabeb1db93d61973a9491e1654830f665429f5de9041682d121256"} Oct 01 10:40:43 crc kubenswrapper[4735]: I1001 10:40:43.542880 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gftbk" podStartSLOduration=3.070825322 podStartE2EDuration="4.542853902s" podCreationTimestamp="2025-10-01 10:40:39 +0000 UTC" firstStartedPulling="2025-10-01 10:40:41.477868702 +0000 UTC m=+1400.170689964" lastFinishedPulling="2025-10-01 10:40:42.949897252 +0000 UTC m=+1401.642718544" observedRunningTime="2025-10-01 10:40:43.534070672 +0000 UTC m=+1402.226891974" watchObservedRunningTime="2025-10-01 10:40:43.542853902 +0000 UTC m=+1402.235675194" Oct 01 10:40:49 crc kubenswrapper[4735]: I1001 10:40:49.884853 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gftbk" Oct 01 10:40:49 crc kubenswrapper[4735]: I1001 10:40:49.885381 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gftbk" Oct 01 10:40:49 crc kubenswrapper[4735]: I1001 10:40:49.935077 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gftbk" Oct 01 10:40:50 
crc kubenswrapper[4735]: I1001 10:40:50.628761 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gftbk" Oct 01 10:40:51 crc kubenswrapper[4735]: I1001 10:40:51.179053 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gftbk"] Oct 01 10:40:52 crc kubenswrapper[4735]: I1001 10:40:52.009062 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-696cd688cf-kqrbf" podUID="95643272-0db0-4c04-9087-98321b57c893" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Oct 01 10:40:52 crc kubenswrapper[4735]: I1001 10:40:52.599678 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gftbk" podUID="e7e424b8-6b44-498e-9f6b-064a4c857cfb" containerName="registry-server" containerID="cri-o://24cd1773dcaabeb1db93d61973a9491e1654830f665429f5de9041682d121256" gracePeriod=2 Oct 01 10:40:53 crc kubenswrapper[4735]: I1001 10:40:53.100233 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gftbk" Oct 01 10:40:53 crc kubenswrapper[4735]: I1001 10:40:53.134997 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7e424b8-6b44-498e-9f6b-064a4c857cfb-catalog-content\") pod \"e7e424b8-6b44-498e-9f6b-064a4c857cfb\" (UID: \"e7e424b8-6b44-498e-9f6b-064a4c857cfb\") " Oct 01 10:40:53 crc kubenswrapper[4735]: I1001 10:40:53.135096 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7e424b8-6b44-498e-9f6b-064a4c857cfb-utilities\") pod \"e7e424b8-6b44-498e-9f6b-064a4c857cfb\" (UID: \"e7e424b8-6b44-498e-9f6b-064a4c857cfb\") " Oct 01 10:40:53 crc kubenswrapper[4735]: I1001 10:40:53.135201 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dp72x\" (UniqueName: \"kubernetes.io/projected/e7e424b8-6b44-498e-9f6b-064a4c857cfb-kube-api-access-dp72x\") pod \"e7e424b8-6b44-498e-9f6b-064a4c857cfb\" (UID: \"e7e424b8-6b44-498e-9f6b-064a4c857cfb\") " Oct 01 10:40:53 crc kubenswrapper[4735]: I1001 10:40:53.136899 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7e424b8-6b44-498e-9f6b-064a4c857cfb-utilities" (OuterVolumeSpecName: "utilities") pod "e7e424b8-6b44-498e-9f6b-064a4c857cfb" (UID: "e7e424b8-6b44-498e-9f6b-064a4c857cfb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:40:53 crc kubenswrapper[4735]: I1001 10:40:53.142645 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e424b8-6b44-498e-9f6b-064a4c857cfb-kube-api-access-dp72x" (OuterVolumeSpecName: "kube-api-access-dp72x") pod "e7e424b8-6b44-498e-9f6b-064a4c857cfb" (UID: "e7e424b8-6b44-498e-9f6b-064a4c857cfb"). InnerVolumeSpecName "kube-api-access-dp72x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:40:53 crc kubenswrapper[4735]: I1001 10:40:53.155961 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7e424b8-6b44-498e-9f6b-064a4c857cfb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e7e424b8-6b44-498e-9f6b-064a4c857cfb" (UID: "e7e424b8-6b44-498e-9f6b-064a4c857cfb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:40:53 crc kubenswrapper[4735]: I1001 10:40:53.238109 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7e424b8-6b44-498e-9f6b-064a4c857cfb-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 10:40:53 crc kubenswrapper[4735]: I1001 10:40:53.238154 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dp72x\" (UniqueName: \"kubernetes.io/projected/e7e424b8-6b44-498e-9f6b-064a4c857cfb-kube-api-access-dp72x\") on node \"crc\" DevicePath \"\"" Oct 01 10:40:53 crc kubenswrapper[4735]: I1001 10:40:53.238173 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7e424b8-6b44-498e-9f6b-064a4c857cfb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 10:40:53 crc kubenswrapper[4735]: I1001 10:40:53.613596 4735 generic.go:334] "Generic (PLEG): container finished" podID="e7e424b8-6b44-498e-9f6b-064a4c857cfb" containerID="24cd1773dcaabeb1db93d61973a9491e1654830f665429f5de9041682d121256" exitCode=0 Oct 01 10:40:53 crc kubenswrapper[4735]: I1001 10:40:53.613655 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gftbk" event={"ID":"e7e424b8-6b44-498e-9f6b-064a4c857cfb","Type":"ContainerDied","Data":"24cd1773dcaabeb1db93d61973a9491e1654830f665429f5de9041682d121256"} Oct 01 10:40:53 crc kubenswrapper[4735]: I1001 10:40:53.613831 4735 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-gftbk" event={"ID":"e7e424b8-6b44-498e-9f6b-064a4c857cfb","Type":"ContainerDied","Data":"8c6bb5fc458119e99e23a330968dcc09b54b3f8ea9f160225b7101f9f99a15e0"} Oct 01 10:40:53 crc kubenswrapper[4735]: I1001 10:40:53.613691 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gftbk" Oct 01 10:40:53 crc kubenswrapper[4735]: I1001 10:40:53.613891 4735 scope.go:117] "RemoveContainer" containerID="24cd1773dcaabeb1db93d61973a9491e1654830f665429f5de9041682d121256" Oct 01 10:40:53 crc kubenswrapper[4735]: I1001 10:40:53.638568 4735 scope.go:117] "RemoveContainer" containerID="6ef457741eef72e00719b789778f617073c3f12e78dc047369d8563e60ae3b28" Oct 01 10:40:53 crc kubenswrapper[4735]: I1001 10:40:53.670465 4735 scope.go:117] "RemoveContainer" containerID="515e0fc12bdc66240cfac33d8bac323ac993eb6f1e9e28ccc6a5783840f7d0e4" Oct 01 10:40:53 crc kubenswrapper[4735]: I1001 10:40:53.671257 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gftbk"] Oct 01 10:40:53 crc kubenswrapper[4735]: I1001 10:40:53.680902 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gftbk"] Oct 01 10:40:53 crc kubenswrapper[4735]: I1001 10:40:53.720112 4735 scope.go:117] "RemoveContainer" containerID="24cd1773dcaabeb1db93d61973a9491e1654830f665429f5de9041682d121256" Oct 01 10:40:53 crc kubenswrapper[4735]: E1001 10:40:53.720626 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24cd1773dcaabeb1db93d61973a9491e1654830f665429f5de9041682d121256\": container with ID starting with 24cd1773dcaabeb1db93d61973a9491e1654830f665429f5de9041682d121256 not found: ID does not exist" containerID="24cd1773dcaabeb1db93d61973a9491e1654830f665429f5de9041682d121256" Oct 01 10:40:53 crc kubenswrapper[4735]: I1001 10:40:53.720686 4735 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24cd1773dcaabeb1db93d61973a9491e1654830f665429f5de9041682d121256"} err="failed to get container status \"24cd1773dcaabeb1db93d61973a9491e1654830f665429f5de9041682d121256\": rpc error: code = NotFound desc = could not find container \"24cd1773dcaabeb1db93d61973a9491e1654830f665429f5de9041682d121256\": container with ID starting with 24cd1773dcaabeb1db93d61973a9491e1654830f665429f5de9041682d121256 not found: ID does not exist" Oct 01 10:40:53 crc kubenswrapper[4735]: I1001 10:40:53.720720 4735 scope.go:117] "RemoveContainer" containerID="6ef457741eef72e00719b789778f617073c3f12e78dc047369d8563e60ae3b28" Oct 01 10:40:53 crc kubenswrapper[4735]: E1001 10:40:53.721377 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ef457741eef72e00719b789778f617073c3f12e78dc047369d8563e60ae3b28\": container with ID starting with 6ef457741eef72e00719b789778f617073c3f12e78dc047369d8563e60ae3b28 not found: ID does not exist" containerID="6ef457741eef72e00719b789778f617073c3f12e78dc047369d8563e60ae3b28" Oct 01 10:40:53 crc kubenswrapper[4735]: I1001 10:40:53.721424 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ef457741eef72e00719b789778f617073c3f12e78dc047369d8563e60ae3b28"} err="failed to get container status \"6ef457741eef72e00719b789778f617073c3f12e78dc047369d8563e60ae3b28\": rpc error: code = NotFound desc = could not find container \"6ef457741eef72e00719b789778f617073c3f12e78dc047369d8563e60ae3b28\": container with ID starting with 6ef457741eef72e00719b789778f617073c3f12e78dc047369d8563e60ae3b28 not found: ID does not exist" Oct 01 10:40:53 crc kubenswrapper[4735]: I1001 10:40:53.721457 4735 scope.go:117] "RemoveContainer" containerID="515e0fc12bdc66240cfac33d8bac323ac993eb6f1e9e28ccc6a5783840f7d0e4" Oct 01 10:40:53 crc kubenswrapper[4735]: E1001 
10:40:53.721766 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"515e0fc12bdc66240cfac33d8bac323ac993eb6f1e9e28ccc6a5783840f7d0e4\": container with ID starting with 515e0fc12bdc66240cfac33d8bac323ac993eb6f1e9e28ccc6a5783840f7d0e4 not found: ID does not exist" containerID="515e0fc12bdc66240cfac33d8bac323ac993eb6f1e9e28ccc6a5783840f7d0e4" Oct 01 10:40:53 crc kubenswrapper[4735]: I1001 10:40:53.721827 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"515e0fc12bdc66240cfac33d8bac323ac993eb6f1e9e28ccc6a5783840f7d0e4"} err="failed to get container status \"515e0fc12bdc66240cfac33d8bac323ac993eb6f1e9e28ccc6a5783840f7d0e4\": rpc error: code = NotFound desc = could not find container \"515e0fc12bdc66240cfac33d8bac323ac993eb6f1e9e28ccc6a5783840f7d0e4\": container with ID starting with 515e0fc12bdc66240cfac33d8bac323ac993eb6f1e9e28ccc6a5783840f7d0e4 not found: ID does not exist" Oct 01 10:40:53 crc kubenswrapper[4735]: I1001 10:40:53.916600 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e424b8-6b44-498e-9f6b-064a4c857cfb" path="/var/lib/kubelet/pods/e7e424b8-6b44-498e-9f6b-064a4c857cfb/volumes" Oct 01 10:41:05 crc kubenswrapper[4735]: I1001 10:41:05.387130 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2xktj"] Oct 01 10:41:05 crc kubenswrapper[4735]: E1001 10:41:05.388390 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7e424b8-6b44-498e-9f6b-064a4c857cfb" containerName="registry-server" Oct 01 10:41:05 crc kubenswrapper[4735]: I1001 10:41:05.388407 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7e424b8-6b44-498e-9f6b-064a4c857cfb" containerName="registry-server" Oct 01 10:41:05 crc kubenswrapper[4735]: E1001 10:41:05.388433 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7e424b8-6b44-498e-9f6b-064a4c857cfb" 
containerName="extract-content" Oct 01 10:41:05 crc kubenswrapper[4735]: I1001 10:41:05.388441 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7e424b8-6b44-498e-9f6b-064a4c857cfb" containerName="extract-content" Oct 01 10:41:05 crc kubenswrapper[4735]: E1001 10:41:05.388470 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7e424b8-6b44-498e-9f6b-064a4c857cfb" containerName="extract-utilities" Oct 01 10:41:05 crc kubenswrapper[4735]: I1001 10:41:05.388480 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7e424b8-6b44-498e-9f6b-064a4c857cfb" containerName="extract-utilities" Oct 01 10:41:05 crc kubenswrapper[4735]: I1001 10:41:05.388740 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7e424b8-6b44-498e-9f6b-064a4c857cfb" containerName="registry-server" Oct 01 10:41:05 crc kubenswrapper[4735]: I1001 10:41:05.390398 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2xktj" Oct 01 10:41:05 crc kubenswrapper[4735]: I1001 10:41:05.405209 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2xktj"] Oct 01 10:41:05 crc kubenswrapper[4735]: I1001 10:41:05.479710 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn4xv\" (UniqueName: \"kubernetes.io/projected/ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f-kube-api-access-mn4xv\") pod \"redhat-operators-2xktj\" (UID: \"ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f\") " pod="openshift-marketplace/redhat-operators-2xktj" Oct 01 10:41:05 crc kubenswrapper[4735]: I1001 10:41:05.479781 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f-utilities\") pod \"redhat-operators-2xktj\" (UID: \"ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f\") " 
pod="openshift-marketplace/redhat-operators-2xktj" Oct 01 10:41:05 crc kubenswrapper[4735]: I1001 10:41:05.480010 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f-catalog-content\") pod \"redhat-operators-2xktj\" (UID: \"ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f\") " pod="openshift-marketplace/redhat-operators-2xktj" Oct 01 10:41:05 crc kubenswrapper[4735]: I1001 10:41:05.581630 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn4xv\" (UniqueName: \"kubernetes.io/projected/ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f-kube-api-access-mn4xv\") pod \"redhat-operators-2xktj\" (UID: \"ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f\") " pod="openshift-marketplace/redhat-operators-2xktj" Oct 01 10:41:05 crc kubenswrapper[4735]: I1001 10:41:05.581901 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f-utilities\") pod \"redhat-operators-2xktj\" (UID: \"ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f\") " pod="openshift-marketplace/redhat-operators-2xktj" Oct 01 10:41:05 crc kubenswrapper[4735]: I1001 10:41:05.582026 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f-catalog-content\") pod \"redhat-operators-2xktj\" (UID: \"ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f\") " pod="openshift-marketplace/redhat-operators-2xktj" Oct 01 10:41:05 crc kubenswrapper[4735]: I1001 10:41:05.583073 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f-utilities\") pod \"redhat-operators-2xktj\" (UID: \"ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f\") " 
pod="openshift-marketplace/redhat-operators-2xktj" Oct 01 10:41:05 crc kubenswrapper[4735]: I1001 10:41:05.583008 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f-catalog-content\") pod \"redhat-operators-2xktj\" (UID: \"ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f\") " pod="openshift-marketplace/redhat-operators-2xktj" Oct 01 10:41:05 crc kubenswrapper[4735]: I1001 10:41:05.601556 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn4xv\" (UniqueName: \"kubernetes.io/projected/ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f-kube-api-access-mn4xv\") pod \"redhat-operators-2xktj\" (UID: \"ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f\") " pod="openshift-marketplace/redhat-operators-2xktj" Oct 01 10:41:05 crc kubenswrapper[4735]: I1001 10:41:05.717139 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2xktj" Oct 01 10:41:06 crc kubenswrapper[4735]: I1001 10:41:06.151287 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2xktj"] Oct 01 10:41:06 crc kubenswrapper[4735]: I1001 10:41:06.737158 4735 generic.go:334] "Generic (PLEG): container finished" podID="ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f" containerID="e533718eebb2dc4a1d2366cac695aa9229b5fbab86ae588b9ccb97b692caa6f9" exitCode=0 Oct 01 10:41:06 crc kubenswrapper[4735]: I1001 10:41:06.737207 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2xktj" event={"ID":"ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f","Type":"ContainerDied","Data":"e533718eebb2dc4a1d2366cac695aa9229b5fbab86ae588b9ccb97b692caa6f9"} Oct 01 10:41:06 crc kubenswrapper[4735]: I1001 10:41:06.737451 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2xktj" 
event={"ID":"ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f","Type":"ContainerStarted","Data":"c72a49eaa98a5aec01aa2dc87b0af4277a3fc2fb06d668b43bbedb5a7a5ddaee"} Oct 01 10:41:08 crc kubenswrapper[4735]: I1001 10:41:08.757570 4735 generic.go:334] "Generic (PLEG): container finished" podID="ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f" containerID="37147b126b03e41ded0295aaf73bcc756a5fafe5fb5ca112f986dc06cce00dd0" exitCode=0 Oct 01 10:41:08 crc kubenswrapper[4735]: I1001 10:41:08.757615 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2xktj" event={"ID":"ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f","Type":"ContainerDied","Data":"37147b126b03e41ded0295aaf73bcc756a5fafe5fb5ca112f986dc06cce00dd0"} Oct 01 10:41:10 crc kubenswrapper[4735]: I1001 10:41:10.785631 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2xktj" event={"ID":"ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f","Type":"ContainerStarted","Data":"95304761a75d34989a1af4d55310561de3db82a478525a8144e53ee31bffaca8"} Oct 01 10:41:10 crc kubenswrapper[4735]: I1001 10:41:10.817333 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2xktj" podStartSLOduration=2.655674397 podStartE2EDuration="5.817312892s" podCreationTimestamp="2025-10-01 10:41:05 +0000 UTC" firstStartedPulling="2025-10-01 10:41:06.739086907 +0000 UTC m=+1425.431908169" lastFinishedPulling="2025-10-01 10:41:09.900725382 +0000 UTC m=+1428.593546664" observedRunningTime="2025-10-01 10:41:10.808054456 +0000 UTC m=+1429.500875718" watchObservedRunningTime="2025-10-01 10:41:10.817312892 +0000 UTC m=+1429.510134164" Oct 01 10:41:15 crc kubenswrapper[4735]: I1001 10:41:15.717735 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2xktj" Oct 01 10:41:15 crc kubenswrapper[4735]: I1001 10:41:15.718336 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-operators-2xktj" Oct 01 10:41:15 crc kubenswrapper[4735]: I1001 10:41:15.792799 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2xktj" Oct 01 10:41:15 crc kubenswrapper[4735]: I1001 10:41:15.889453 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2xktj" Oct 01 10:41:16 crc kubenswrapper[4735]: I1001 10:41:16.037373 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2xktj"] Oct 01 10:41:17 crc kubenswrapper[4735]: I1001 10:41:17.851241 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2xktj" podUID="ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f" containerName="registry-server" containerID="cri-o://95304761a75d34989a1af4d55310561de3db82a478525a8144e53ee31bffaca8" gracePeriod=2 Oct 01 10:41:18 crc kubenswrapper[4735]: I1001 10:41:18.330278 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2xktj" Oct 01 10:41:18 crc kubenswrapper[4735]: I1001 10:41:18.341484 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f-catalog-content\") pod \"ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f\" (UID: \"ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f\") " Oct 01 10:41:18 crc kubenswrapper[4735]: I1001 10:41:18.341599 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mn4xv\" (UniqueName: \"kubernetes.io/projected/ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f-kube-api-access-mn4xv\") pod \"ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f\" (UID: \"ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f\") " Oct 01 10:41:18 crc kubenswrapper[4735]: I1001 10:41:18.341870 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f-utilities\") pod \"ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f\" (UID: \"ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f\") " Oct 01 10:41:18 crc kubenswrapper[4735]: I1001 10:41:18.342663 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f-utilities" (OuterVolumeSpecName: "utilities") pod "ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f" (UID: "ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:41:18 crc kubenswrapper[4735]: I1001 10:41:18.347142 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f-kube-api-access-mn4xv" (OuterVolumeSpecName: "kube-api-access-mn4xv") pod "ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f" (UID: "ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f"). InnerVolumeSpecName "kube-api-access-mn4xv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:41:18 crc kubenswrapper[4735]: I1001 10:41:18.426997 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f" (UID: "ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:41:18 crc kubenswrapper[4735]: I1001 10:41:18.444736 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 10:41:18 crc kubenswrapper[4735]: I1001 10:41:18.444778 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mn4xv\" (UniqueName: \"kubernetes.io/projected/ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f-kube-api-access-mn4xv\") on node \"crc\" DevicePath \"\"" Oct 01 10:41:18 crc kubenswrapper[4735]: I1001 10:41:18.444790 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 10:41:18 crc kubenswrapper[4735]: I1001 10:41:18.872554 4735 generic.go:334] "Generic (PLEG): container finished" podID="ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f" containerID="95304761a75d34989a1af4d55310561de3db82a478525a8144e53ee31bffaca8" exitCode=0 Oct 01 10:41:18 crc kubenswrapper[4735]: I1001 10:41:18.872603 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2xktj" event={"ID":"ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f","Type":"ContainerDied","Data":"95304761a75d34989a1af4d55310561de3db82a478525a8144e53ee31bffaca8"} Oct 01 10:41:18 crc kubenswrapper[4735]: I1001 10:41:18.872629 4735 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-2xktj" event={"ID":"ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f","Type":"ContainerDied","Data":"c72a49eaa98a5aec01aa2dc87b0af4277a3fc2fb06d668b43bbedb5a7a5ddaee"} Oct 01 10:41:18 crc kubenswrapper[4735]: I1001 10:41:18.872645 4735 scope.go:117] "RemoveContainer" containerID="95304761a75d34989a1af4d55310561de3db82a478525a8144e53ee31bffaca8" Oct 01 10:41:18 crc kubenswrapper[4735]: I1001 10:41:18.872786 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2xktj" Oct 01 10:41:18 crc kubenswrapper[4735]: I1001 10:41:18.903927 4735 scope.go:117] "RemoveContainer" containerID="37147b126b03e41ded0295aaf73bcc756a5fafe5fb5ca112f986dc06cce00dd0" Oct 01 10:41:18 crc kubenswrapper[4735]: I1001 10:41:18.915882 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2xktj"] Oct 01 10:41:18 crc kubenswrapper[4735]: I1001 10:41:18.930880 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2xktj"] Oct 01 10:41:18 crc kubenswrapper[4735]: I1001 10:41:18.938137 4735 scope.go:117] "RemoveContainer" containerID="e533718eebb2dc4a1d2366cac695aa9229b5fbab86ae588b9ccb97b692caa6f9" Oct 01 10:41:18 crc kubenswrapper[4735]: I1001 10:41:18.967324 4735 scope.go:117] "RemoveContainer" containerID="95304761a75d34989a1af4d55310561de3db82a478525a8144e53ee31bffaca8" Oct 01 10:41:18 crc kubenswrapper[4735]: E1001 10:41:18.967776 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95304761a75d34989a1af4d55310561de3db82a478525a8144e53ee31bffaca8\": container with ID starting with 95304761a75d34989a1af4d55310561de3db82a478525a8144e53ee31bffaca8 not found: ID does not exist" containerID="95304761a75d34989a1af4d55310561de3db82a478525a8144e53ee31bffaca8" Oct 01 10:41:18 crc kubenswrapper[4735]: I1001 10:41:18.967807 4735 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95304761a75d34989a1af4d55310561de3db82a478525a8144e53ee31bffaca8"} err="failed to get container status \"95304761a75d34989a1af4d55310561de3db82a478525a8144e53ee31bffaca8\": rpc error: code = NotFound desc = could not find container \"95304761a75d34989a1af4d55310561de3db82a478525a8144e53ee31bffaca8\": container with ID starting with 95304761a75d34989a1af4d55310561de3db82a478525a8144e53ee31bffaca8 not found: ID does not exist" Oct 01 10:41:18 crc kubenswrapper[4735]: I1001 10:41:18.967827 4735 scope.go:117] "RemoveContainer" containerID="37147b126b03e41ded0295aaf73bcc756a5fafe5fb5ca112f986dc06cce00dd0" Oct 01 10:41:18 crc kubenswrapper[4735]: E1001 10:41:18.968336 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37147b126b03e41ded0295aaf73bcc756a5fafe5fb5ca112f986dc06cce00dd0\": container with ID starting with 37147b126b03e41ded0295aaf73bcc756a5fafe5fb5ca112f986dc06cce00dd0 not found: ID does not exist" containerID="37147b126b03e41ded0295aaf73bcc756a5fafe5fb5ca112f986dc06cce00dd0" Oct 01 10:41:18 crc kubenswrapper[4735]: I1001 10:41:18.968364 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37147b126b03e41ded0295aaf73bcc756a5fafe5fb5ca112f986dc06cce00dd0"} err="failed to get container status \"37147b126b03e41ded0295aaf73bcc756a5fafe5fb5ca112f986dc06cce00dd0\": rpc error: code = NotFound desc = could not find container \"37147b126b03e41ded0295aaf73bcc756a5fafe5fb5ca112f986dc06cce00dd0\": container with ID starting with 37147b126b03e41ded0295aaf73bcc756a5fafe5fb5ca112f986dc06cce00dd0 not found: ID does not exist" Oct 01 10:41:18 crc kubenswrapper[4735]: I1001 10:41:18.968392 4735 scope.go:117] "RemoveContainer" containerID="e533718eebb2dc4a1d2366cac695aa9229b5fbab86ae588b9ccb97b692caa6f9" Oct 01 10:41:18 crc kubenswrapper[4735]: E1001 
10:41:18.968657 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e533718eebb2dc4a1d2366cac695aa9229b5fbab86ae588b9ccb97b692caa6f9\": container with ID starting with e533718eebb2dc4a1d2366cac695aa9229b5fbab86ae588b9ccb97b692caa6f9 not found: ID does not exist" containerID="e533718eebb2dc4a1d2366cac695aa9229b5fbab86ae588b9ccb97b692caa6f9" Oct 01 10:41:18 crc kubenswrapper[4735]: I1001 10:41:18.968677 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e533718eebb2dc4a1d2366cac695aa9229b5fbab86ae588b9ccb97b692caa6f9"} err="failed to get container status \"e533718eebb2dc4a1d2366cac695aa9229b5fbab86ae588b9ccb97b692caa6f9\": rpc error: code = NotFound desc = could not find container \"e533718eebb2dc4a1d2366cac695aa9229b5fbab86ae588b9ccb97b692caa6f9\": container with ID starting with e533718eebb2dc4a1d2366cac695aa9229b5fbab86ae588b9ccb97b692caa6f9 not found: ID does not exist" Oct 01 10:41:19 crc kubenswrapper[4735]: I1001 10:41:19.908882 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f" path="/var/lib/kubelet/pods/ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f/volumes" Oct 01 10:41:33 crc kubenswrapper[4735]: I1001 10:41:33.018661 4735 generic.go:334] "Generic (PLEG): container finished" podID="ffacc50e-734b-4e8a-ac0c-a33197ce2351" containerID="82b533cabd58e892dd6eedf82dbf37cfb938fc59cba05506e4ef2472fc240cfa" exitCode=0 Oct 01 10:41:33 crc kubenswrapper[4735]: I1001 10:41:33.018742 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq" event={"ID":"ffacc50e-734b-4e8a-ac0c-a33197ce2351","Type":"ContainerDied","Data":"82b533cabd58e892dd6eedf82dbf37cfb938fc59cba05506e4ef2472fc240cfa"} Oct 01 10:41:34 crc kubenswrapper[4735]: I1001 10:41:34.458802 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq" Oct 01 10:41:34 crc kubenswrapper[4735]: I1001 10:41:34.583749 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7czz\" (UniqueName: \"kubernetes.io/projected/ffacc50e-734b-4e8a-ac0c-a33197ce2351-kube-api-access-n7czz\") pod \"ffacc50e-734b-4e8a-ac0c-a33197ce2351\" (UID: \"ffacc50e-734b-4e8a-ac0c-a33197ce2351\") " Oct 01 10:41:34 crc kubenswrapper[4735]: I1001 10:41:34.583976 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffacc50e-734b-4e8a-ac0c-a33197ce2351-bootstrap-combined-ca-bundle\") pod \"ffacc50e-734b-4e8a-ac0c-a33197ce2351\" (UID: \"ffacc50e-734b-4e8a-ac0c-a33197ce2351\") " Oct 01 10:41:34 crc kubenswrapper[4735]: I1001 10:41:34.584040 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffacc50e-734b-4e8a-ac0c-a33197ce2351-ssh-key\") pod \"ffacc50e-734b-4e8a-ac0c-a33197ce2351\" (UID: \"ffacc50e-734b-4e8a-ac0c-a33197ce2351\") " Oct 01 10:41:34 crc kubenswrapper[4735]: I1001 10:41:34.584795 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffacc50e-734b-4e8a-ac0c-a33197ce2351-inventory\") pod \"ffacc50e-734b-4e8a-ac0c-a33197ce2351\" (UID: \"ffacc50e-734b-4e8a-ac0c-a33197ce2351\") " Oct 01 10:41:34 crc kubenswrapper[4735]: I1001 10:41:34.589107 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffacc50e-734b-4e8a-ac0c-a33197ce2351-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "ffacc50e-734b-4e8a-ac0c-a33197ce2351" (UID: "ffacc50e-734b-4e8a-ac0c-a33197ce2351"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:41:34 crc kubenswrapper[4735]: I1001 10:41:34.589273 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffacc50e-734b-4e8a-ac0c-a33197ce2351-kube-api-access-n7czz" (OuterVolumeSpecName: "kube-api-access-n7czz") pod "ffacc50e-734b-4e8a-ac0c-a33197ce2351" (UID: "ffacc50e-734b-4e8a-ac0c-a33197ce2351"). InnerVolumeSpecName "kube-api-access-n7czz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:41:34 crc kubenswrapper[4735]: I1001 10:41:34.616385 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffacc50e-734b-4e8a-ac0c-a33197ce2351-inventory" (OuterVolumeSpecName: "inventory") pod "ffacc50e-734b-4e8a-ac0c-a33197ce2351" (UID: "ffacc50e-734b-4e8a-ac0c-a33197ce2351"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:41:34 crc kubenswrapper[4735]: I1001 10:41:34.624006 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffacc50e-734b-4e8a-ac0c-a33197ce2351-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ffacc50e-734b-4e8a-ac0c-a33197ce2351" (UID: "ffacc50e-734b-4e8a-ac0c-a33197ce2351"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:41:34 crc kubenswrapper[4735]: I1001 10:41:34.687437 4735 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffacc50e-734b-4e8a-ac0c-a33197ce2351-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:41:34 crc kubenswrapper[4735]: I1001 10:41:34.687485 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffacc50e-734b-4e8a-ac0c-a33197ce2351-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 10:41:34 crc kubenswrapper[4735]: I1001 10:41:34.687519 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffacc50e-734b-4e8a-ac0c-a33197ce2351-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 10:41:34 crc kubenswrapper[4735]: I1001 10:41:34.687532 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7czz\" (UniqueName: \"kubernetes.io/projected/ffacc50e-734b-4e8a-ac0c-a33197ce2351-kube-api-access-n7czz\") on node \"crc\" DevicePath \"\"" Oct 01 10:41:35 crc kubenswrapper[4735]: I1001 10:41:35.039302 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq" event={"ID":"ffacc50e-734b-4e8a-ac0c-a33197ce2351","Type":"ContainerDied","Data":"e3a0a091020227fd647e3b74dd2b91499ce73124f06e149b8cfda0aa021e629d"} Oct 01 10:41:35 crc kubenswrapper[4735]: I1001 10:41:35.039338 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq" Oct 01 10:41:35 crc kubenswrapper[4735]: I1001 10:41:35.039355 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3a0a091020227fd647e3b74dd2b91499ce73124f06e149b8cfda0aa021e629d" Oct 01 10:41:35 crc kubenswrapper[4735]: I1001 10:41:35.124434 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pq2m5"] Oct 01 10:41:35 crc kubenswrapper[4735]: E1001 10:41:35.124826 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f" containerName="registry-server" Oct 01 10:41:35 crc kubenswrapper[4735]: I1001 10:41:35.124842 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f" containerName="registry-server" Oct 01 10:41:35 crc kubenswrapper[4735]: E1001 10:41:35.124869 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f" containerName="extract-content" Oct 01 10:41:35 crc kubenswrapper[4735]: I1001 10:41:35.124876 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f" containerName="extract-content" Oct 01 10:41:35 crc kubenswrapper[4735]: E1001 10:41:35.124891 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f" containerName="extract-utilities" Oct 01 10:41:35 crc kubenswrapper[4735]: I1001 10:41:35.124898 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f" containerName="extract-utilities" Oct 01 10:41:35 crc kubenswrapper[4735]: E1001 10:41:35.124912 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffacc50e-734b-4e8a-ac0c-a33197ce2351" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 01 10:41:35 crc kubenswrapper[4735]: I1001 10:41:35.124918 4735 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ffacc50e-734b-4e8a-ac0c-a33197ce2351" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 01 10:41:35 crc kubenswrapper[4735]: I1001 10:41:35.125116 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffacc50e-734b-4e8a-ac0c-a33197ce2351" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 01 10:41:35 crc kubenswrapper[4735]: I1001 10:41:35.125135 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac02b0db-cf9e-467f-a4b5-0e8e2d0d0d3f" containerName="registry-server" Oct 01 10:41:35 crc kubenswrapper[4735]: I1001 10:41:35.125712 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pq2m5" Oct 01 10:41:35 crc kubenswrapper[4735]: I1001 10:41:35.127660 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ggpjk" Oct 01 10:41:35 crc kubenswrapper[4735]: I1001 10:41:35.127811 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 10:41:35 crc kubenswrapper[4735]: I1001 10:41:35.128842 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 10:41:35 crc kubenswrapper[4735]: I1001 10:41:35.128882 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 10:41:35 crc kubenswrapper[4735]: I1001 10:41:35.143105 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pq2m5"] Oct 01 10:41:35 crc kubenswrapper[4735]: I1001 10:41:35.197143 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31970fe6-5cef-41cc-8799-c4e9de559f23-ssh-key\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-pq2m5\" (UID: \"31970fe6-5cef-41cc-8799-c4e9de559f23\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pq2m5" Oct 01 10:41:35 crc kubenswrapper[4735]: I1001 10:41:35.197202 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thn5g\" (UniqueName: \"kubernetes.io/projected/31970fe6-5cef-41cc-8799-c4e9de559f23-kube-api-access-thn5g\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pq2m5\" (UID: \"31970fe6-5cef-41cc-8799-c4e9de559f23\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pq2m5" Oct 01 10:41:35 crc kubenswrapper[4735]: I1001 10:41:35.197352 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31970fe6-5cef-41cc-8799-c4e9de559f23-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pq2m5\" (UID: \"31970fe6-5cef-41cc-8799-c4e9de559f23\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pq2m5" Oct 01 10:41:35 crc kubenswrapper[4735]: I1001 10:41:35.298705 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31970fe6-5cef-41cc-8799-c4e9de559f23-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pq2m5\" (UID: \"31970fe6-5cef-41cc-8799-c4e9de559f23\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pq2m5" Oct 01 10:41:35 crc kubenswrapper[4735]: I1001 10:41:35.299070 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thn5g\" (UniqueName: \"kubernetes.io/projected/31970fe6-5cef-41cc-8799-c4e9de559f23-kube-api-access-thn5g\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pq2m5\" (UID: \"31970fe6-5cef-41cc-8799-c4e9de559f23\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pq2m5" 
Oct 01 10:41:35 crc kubenswrapper[4735]: I1001 10:41:35.299262 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31970fe6-5cef-41cc-8799-c4e9de559f23-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pq2m5\" (UID: \"31970fe6-5cef-41cc-8799-c4e9de559f23\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pq2m5" Oct 01 10:41:35 crc kubenswrapper[4735]: I1001 10:41:35.303680 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31970fe6-5cef-41cc-8799-c4e9de559f23-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pq2m5\" (UID: \"31970fe6-5cef-41cc-8799-c4e9de559f23\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pq2m5" Oct 01 10:41:35 crc kubenswrapper[4735]: I1001 10:41:35.304460 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31970fe6-5cef-41cc-8799-c4e9de559f23-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pq2m5\" (UID: \"31970fe6-5cef-41cc-8799-c4e9de559f23\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pq2m5" Oct 01 10:41:35 crc kubenswrapper[4735]: I1001 10:41:35.315332 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thn5g\" (UniqueName: \"kubernetes.io/projected/31970fe6-5cef-41cc-8799-c4e9de559f23-kube-api-access-thn5g\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pq2m5\" (UID: \"31970fe6-5cef-41cc-8799-c4e9de559f23\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pq2m5" Oct 01 10:41:35 crc kubenswrapper[4735]: I1001 10:41:35.448103 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pq2m5" Oct 01 10:41:35 crc kubenswrapper[4735]: I1001 10:41:35.992598 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pq2m5"] Oct 01 10:41:36 crc kubenswrapper[4735]: I1001 10:41:36.049631 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pq2m5" event={"ID":"31970fe6-5cef-41cc-8799-c4e9de559f23","Type":"ContainerStarted","Data":"7460784d42e493c565b39a6a25d4f11bd08e5c302ec80f3bb81ff399f974738e"} Oct 01 10:41:37 crc kubenswrapper[4735]: I1001 10:41:37.072636 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pq2m5" event={"ID":"31970fe6-5cef-41cc-8799-c4e9de559f23","Type":"ContainerStarted","Data":"be952252c41b9b14cccda1017dd44f722dc239783b9bdc5fb4db77c14010c739"} Oct 01 10:41:37 crc kubenswrapper[4735]: I1001 10:41:37.093677 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pq2m5" podStartSLOduration=1.520729941 podStartE2EDuration="2.093650856s" podCreationTimestamp="2025-10-01 10:41:35 +0000 UTC" firstStartedPulling="2025-10-01 10:41:36.000307688 +0000 UTC m=+1454.693128960" lastFinishedPulling="2025-10-01 10:41:36.573228603 +0000 UTC m=+1455.266049875" observedRunningTime="2025-10-01 10:41:37.08894284 +0000 UTC m=+1455.781764112" watchObservedRunningTime="2025-10-01 10:41:37.093650856 +0000 UTC m=+1455.786472128" Oct 01 10:42:23 crc kubenswrapper[4735]: I1001 10:42:23.048406 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-4cn4h"] Oct 01 10:42:23 crc kubenswrapper[4735]: I1001 10:42:23.075771 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-sdsqt"] Oct 01 10:42:23 crc kubenswrapper[4735]: I1001 
10:42:23.086609 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-sdsqt"] Oct 01 10:42:23 crc kubenswrapper[4735]: I1001 10:42:23.097634 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-4cn4h"] Oct 01 10:42:23 crc kubenswrapper[4735]: I1001 10:42:23.907902 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17b59340-1ad8-48a8-8521-c4b72ea14720" path="/var/lib/kubelet/pods/17b59340-1ad8-48a8-8521-c4b72ea14720/volumes" Oct 01 10:42:23 crc kubenswrapper[4735]: I1001 10:42:23.908415 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b26f86c5-90f8-43cb-b728-bcdc6eaac262" path="/var/lib/kubelet/pods/b26f86c5-90f8-43cb-b728-bcdc6eaac262/volumes" Oct 01 10:42:25 crc kubenswrapper[4735]: I1001 10:42:25.198309 4735 scope.go:117] "RemoveContainer" containerID="b2c96490c70328115f0b86561f65ea61606b977bcc4ec9286815be1d297dcc5c" Oct 01 10:42:25 crc kubenswrapper[4735]: I1001 10:42:25.232673 4735 scope.go:117] "RemoveContainer" containerID="3ec3ef57eeb0651548c3ecea3aba543d978deb1903c04e2fc0a0fabaf0345c74" Oct 01 10:42:25 crc kubenswrapper[4735]: I1001 10:42:25.275371 4735 scope.go:117] "RemoveContainer" containerID="7ae56c1b4a1be80014ed2d8f1f33cf87a325e10dfc018ac6fce6cde9e673547d" Oct 01 10:42:25 crc kubenswrapper[4735]: I1001 10:42:25.295466 4735 scope.go:117] "RemoveContainer" containerID="3e8fb92337e9a80bca506949f8d810eb0d38642e2f367e9f28c432281bbbdba3" Oct 01 10:42:34 crc kubenswrapper[4735]: I1001 10:42:34.034456 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-rx5s2"] Oct 01 10:42:34 crc kubenswrapper[4735]: I1001 10:42:34.042450 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-fb4d-account-create-wt5pb"] Oct 01 10:42:34 crc kubenswrapper[4735]: I1001 10:42:34.051885 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-4441-account-create-hgh8b"] Oct 01 
10:42:34 crc kubenswrapper[4735]: I1001 10:42:34.060469 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-rx5s2"] Oct 01 10:42:34 crc kubenswrapper[4735]: I1001 10:42:34.067620 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-fb4d-account-create-wt5pb"] Oct 01 10:42:34 crc kubenswrapper[4735]: I1001 10:42:34.074894 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-4441-account-create-hgh8b"] Oct 01 10:42:35 crc kubenswrapper[4735]: I1001 10:42:35.485168 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 10:42:35 crc kubenswrapper[4735]: I1001 10:42:35.485234 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 10:42:35 crc kubenswrapper[4735]: I1001 10:42:35.908527 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11967ebb-ac0f-4d46-adb0-b100ee29528b" path="/var/lib/kubelet/pods/11967ebb-ac0f-4d46-adb0-b100ee29528b/volumes" Oct 01 10:42:35 crc kubenswrapper[4735]: I1001 10:42:35.909265 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="254c94ca-4d51-4559-b88f-5363ff751d36" path="/var/lib/kubelet/pods/254c94ca-4d51-4559-b88f-5363ff751d36/volumes" Oct 01 10:42:35 crc kubenswrapper[4735]: I1001 10:42:35.909792 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6bf79da-946c-4805-9ba3-0b58e969b33a" path="/var/lib/kubelet/pods/d6bf79da-946c-4805-9ba3-0b58e969b33a/volumes" Oct 01 
10:42:49 crc kubenswrapper[4735]: I1001 10:42:49.042374 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-73ae-account-create-n72rm"] Oct 01 10:42:49 crc kubenswrapper[4735]: I1001 10:42:49.053260 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-73ae-account-create-n72rm"] Oct 01 10:42:49 crc kubenswrapper[4735]: I1001 10:42:49.909124 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52c13772-7115-4653-949d-ae1ef126bfd9" path="/var/lib/kubelet/pods/52c13772-7115-4653-949d-ae1ef126bfd9/volumes" Oct 01 10:42:52 crc kubenswrapper[4735]: I1001 10:42:52.027429 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-rfhkm"] Oct 01 10:42:52 crc kubenswrapper[4735]: I1001 10:42:52.034918 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-59bzl"] Oct 01 10:42:52 crc kubenswrapper[4735]: I1001 10:42:52.048049 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-qczl5"] Oct 01 10:42:52 crc kubenswrapper[4735]: I1001 10:42:52.055874 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-59bzl"] Oct 01 10:42:52 crc kubenswrapper[4735]: I1001 10:42:52.063103 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-rfhkm"] Oct 01 10:42:52 crc kubenswrapper[4735]: I1001 10:42:52.070344 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-qczl5"] Oct 01 10:42:53 crc kubenswrapper[4735]: I1001 10:42:53.918199 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27dc947b-08ef-428b-aa2c-9a59d002b8c3" path="/var/lib/kubelet/pods/27dc947b-08ef-428b-aa2c-9a59d002b8c3/volumes" Oct 01 10:42:53 crc kubenswrapper[4735]: I1001 10:42:53.919902 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d1cf88a-d884-4bdb-bb43-cc0a8f84da82" 
path="/var/lib/kubelet/pods/3d1cf88a-d884-4bdb-bb43-cc0a8f84da82/volumes" Oct 01 10:42:53 crc kubenswrapper[4735]: I1001 10:42:53.921203 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f855bd65-1b1b-40e5-98b1-8772d7cb3c8b" path="/var/lib/kubelet/pods/f855bd65-1b1b-40e5-98b1-8772d7cb3c8b/volumes" Oct 01 10:42:59 crc kubenswrapper[4735]: I1001 10:42:59.027983 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-qksn5"] Oct 01 10:42:59 crc kubenswrapper[4735]: I1001 10:42:59.034850 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-qksn5"] Oct 01 10:42:59 crc kubenswrapper[4735]: I1001 10:42:59.910263 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eba15328-be82-4e04-a4bf-8097322615de" path="/var/lib/kubelet/pods/eba15328-be82-4e04-a4bf-8097322615de/volumes" Oct 01 10:43:01 crc kubenswrapper[4735]: I1001 10:43:01.024994 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-mkgqx"] Oct 01 10:43:01 crc kubenswrapper[4735]: I1001 10:43:01.031960 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-mkgqx"] Oct 01 10:43:01 crc kubenswrapper[4735]: I1001 10:43:01.914927 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a7ca219-5b36-4053-bc9c-92e73c7ad509" path="/var/lib/kubelet/pods/9a7ca219-5b36-4053-bc9c-92e73c7ad509/volumes" Oct 01 10:43:05 crc kubenswrapper[4735]: I1001 10:43:05.485775 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 10:43:05 crc kubenswrapper[4735]: I1001 10:43:05.486187 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" 
podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 10:43:05 crc kubenswrapper[4735]: I1001 10:43:05.918306 4735 generic.go:334] "Generic (PLEG): container finished" podID="31970fe6-5cef-41cc-8799-c4e9de559f23" containerID="be952252c41b9b14cccda1017dd44f722dc239783b9bdc5fb4db77c14010c739" exitCode=0 Oct 01 10:43:05 crc kubenswrapper[4735]: I1001 10:43:05.918344 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pq2m5" event={"ID":"31970fe6-5cef-41cc-8799-c4e9de559f23","Type":"ContainerDied","Data":"be952252c41b9b14cccda1017dd44f722dc239783b9bdc5fb4db77c14010c739"} Oct 01 10:43:07 crc kubenswrapper[4735]: I1001 10:43:07.340442 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pq2m5" Oct 01 10:43:07 crc kubenswrapper[4735]: I1001 10:43:07.460458 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thn5g\" (UniqueName: \"kubernetes.io/projected/31970fe6-5cef-41cc-8799-c4e9de559f23-kube-api-access-thn5g\") pod \"31970fe6-5cef-41cc-8799-c4e9de559f23\" (UID: \"31970fe6-5cef-41cc-8799-c4e9de559f23\") " Oct 01 10:43:07 crc kubenswrapper[4735]: I1001 10:43:07.460567 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31970fe6-5cef-41cc-8799-c4e9de559f23-inventory\") pod \"31970fe6-5cef-41cc-8799-c4e9de559f23\" (UID: \"31970fe6-5cef-41cc-8799-c4e9de559f23\") " Oct 01 10:43:07 crc kubenswrapper[4735]: I1001 10:43:07.460804 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31970fe6-5cef-41cc-8799-c4e9de559f23-ssh-key\") pod 
\"31970fe6-5cef-41cc-8799-c4e9de559f23\" (UID: \"31970fe6-5cef-41cc-8799-c4e9de559f23\") " Oct 01 10:43:07 crc kubenswrapper[4735]: I1001 10:43:07.465414 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31970fe6-5cef-41cc-8799-c4e9de559f23-kube-api-access-thn5g" (OuterVolumeSpecName: "kube-api-access-thn5g") pod "31970fe6-5cef-41cc-8799-c4e9de559f23" (UID: "31970fe6-5cef-41cc-8799-c4e9de559f23"). InnerVolumeSpecName "kube-api-access-thn5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:43:07 crc kubenswrapper[4735]: I1001 10:43:07.486962 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31970fe6-5cef-41cc-8799-c4e9de559f23-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "31970fe6-5cef-41cc-8799-c4e9de559f23" (UID: "31970fe6-5cef-41cc-8799-c4e9de559f23"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:43:07 crc kubenswrapper[4735]: I1001 10:43:07.492150 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31970fe6-5cef-41cc-8799-c4e9de559f23-inventory" (OuterVolumeSpecName: "inventory") pod "31970fe6-5cef-41cc-8799-c4e9de559f23" (UID: "31970fe6-5cef-41cc-8799-c4e9de559f23"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:43:07 crc kubenswrapper[4735]: I1001 10:43:07.563684 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thn5g\" (UniqueName: \"kubernetes.io/projected/31970fe6-5cef-41cc-8799-c4e9de559f23-kube-api-access-thn5g\") on node \"crc\" DevicePath \"\"" Oct 01 10:43:07 crc kubenswrapper[4735]: I1001 10:43:07.563714 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31970fe6-5cef-41cc-8799-c4e9de559f23-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 10:43:07 crc kubenswrapper[4735]: I1001 10:43:07.563723 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31970fe6-5cef-41cc-8799-c4e9de559f23-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 10:43:07 crc kubenswrapper[4735]: I1001 10:43:07.939444 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pq2m5" event={"ID":"31970fe6-5cef-41cc-8799-c4e9de559f23","Type":"ContainerDied","Data":"7460784d42e493c565b39a6a25d4f11bd08e5c302ec80f3bb81ff399f974738e"} Oct 01 10:43:07 crc kubenswrapper[4735]: I1001 10:43:07.939481 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7460784d42e493c565b39a6a25d4f11bd08e5c302ec80f3bb81ff399f974738e" Oct 01 10:43:07 crc kubenswrapper[4735]: I1001 10:43:07.939682 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pq2m5" Oct 01 10:43:08 crc kubenswrapper[4735]: I1001 10:43:08.011066 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4hx9b"] Oct 01 10:43:08 crc kubenswrapper[4735]: E1001 10:43:08.011821 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31970fe6-5cef-41cc-8799-c4e9de559f23" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 01 10:43:08 crc kubenswrapper[4735]: I1001 10:43:08.011924 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="31970fe6-5cef-41cc-8799-c4e9de559f23" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 01 10:43:08 crc kubenswrapper[4735]: I1001 10:43:08.012344 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="31970fe6-5cef-41cc-8799-c4e9de559f23" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 01 10:43:08 crc kubenswrapper[4735]: I1001 10:43:08.013268 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4hx9b" Oct 01 10:43:08 crc kubenswrapper[4735]: I1001 10:43:08.019418 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 10:43:08 crc kubenswrapper[4735]: I1001 10:43:08.019454 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ggpjk" Oct 01 10:43:08 crc kubenswrapper[4735]: I1001 10:43:08.019561 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 10:43:08 crc kubenswrapper[4735]: I1001 10:43:08.019581 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 10:43:08 crc kubenswrapper[4735]: I1001 10:43:08.024017 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4hx9b"] Oct 01 10:43:08 crc kubenswrapper[4735]: I1001 10:43:08.072730 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wccpb\" (UniqueName: \"kubernetes.io/projected/697ab5a4-56ef-4755-9211-fcd52866c939-kube-api-access-wccpb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4hx9b\" (UID: \"697ab5a4-56ef-4755-9211-fcd52866c939\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4hx9b" Oct 01 10:43:08 crc kubenswrapper[4735]: I1001 10:43:08.072830 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/697ab5a4-56ef-4755-9211-fcd52866c939-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4hx9b\" (UID: \"697ab5a4-56ef-4755-9211-fcd52866c939\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4hx9b" Oct 01 10:43:08 crc kubenswrapper[4735]: I1001 
10:43:08.072926 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/697ab5a4-56ef-4755-9211-fcd52866c939-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4hx9b\" (UID: \"697ab5a4-56ef-4755-9211-fcd52866c939\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4hx9b" Oct 01 10:43:08 crc kubenswrapper[4735]: I1001 10:43:08.175288 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wccpb\" (UniqueName: \"kubernetes.io/projected/697ab5a4-56ef-4755-9211-fcd52866c939-kube-api-access-wccpb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4hx9b\" (UID: \"697ab5a4-56ef-4755-9211-fcd52866c939\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4hx9b" Oct 01 10:43:08 crc kubenswrapper[4735]: I1001 10:43:08.175459 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/697ab5a4-56ef-4755-9211-fcd52866c939-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4hx9b\" (UID: \"697ab5a4-56ef-4755-9211-fcd52866c939\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4hx9b" Oct 01 10:43:08 crc kubenswrapper[4735]: I1001 10:43:08.175603 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/697ab5a4-56ef-4755-9211-fcd52866c939-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4hx9b\" (UID: \"697ab5a4-56ef-4755-9211-fcd52866c939\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4hx9b" Oct 01 10:43:08 crc kubenswrapper[4735]: I1001 10:43:08.180166 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/697ab5a4-56ef-4755-9211-fcd52866c939-ssh-key\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-4hx9b\" (UID: \"697ab5a4-56ef-4755-9211-fcd52866c939\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4hx9b" Oct 01 10:43:08 crc kubenswrapper[4735]: I1001 10:43:08.180256 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/697ab5a4-56ef-4755-9211-fcd52866c939-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4hx9b\" (UID: \"697ab5a4-56ef-4755-9211-fcd52866c939\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4hx9b" Oct 01 10:43:08 crc kubenswrapper[4735]: I1001 10:43:08.190432 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wccpb\" (UniqueName: \"kubernetes.io/projected/697ab5a4-56ef-4755-9211-fcd52866c939-kube-api-access-wccpb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4hx9b\" (UID: \"697ab5a4-56ef-4755-9211-fcd52866c939\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4hx9b" Oct 01 10:43:08 crc kubenswrapper[4735]: I1001 10:43:08.335097 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4hx9b" Oct 01 10:43:08 crc kubenswrapper[4735]: I1001 10:43:08.879834 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4hx9b"] Oct 01 10:43:08 crc kubenswrapper[4735]: I1001 10:43:08.888088 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 10:43:08 crc kubenswrapper[4735]: I1001 10:43:08.951589 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4hx9b" event={"ID":"697ab5a4-56ef-4755-9211-fcd52866c939","Type":"ContainerStarted","Data":"86c0ee5c3018a882c2814a93597ffeed5fc92d767267fbcf2bb6e1f6b33f7a1c"} Oct 01 10:43:09 crc kubenswrapper[4735]: I1001 10:43:09.962753 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4hx9b" event={"ID":"697ab5a4-56ef-4755-9211-fcd52866c939","Type":"ContainerStarted","Data":"672933b359729aa029e5eaebf980f16acf4961796e388b7b571dba87898fddd6"} Oct 01 10:43:25 crc kubenswrapper[4735]: I1001 10:43:25.050886 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4hx9b" podStartSLOduration=17.490128809 podStartE2EDuration="18.050861151s" podCreationTimestamp="2025-10-01 10:43:07 +0000 UTC" firstStartedPulling="2025-10-01 10:43:08.887798747 +0000 UTC m=+1547.580620009" lastFinishedPulling="2025-10-01 10:43:09.448531049 +0000 UTC m=+1548.141352351" observedRunningTime="2025-10-01 10:43:09.981731521 +0000 UTC m=+1548.674552803" watchObservedRunningTime="2025-10-01 10:43:25.050861151 +0000 UTC m=+1563.743682433" Oct 01 10:43:25 crc kubenswrapper[4735]: I1001 10:43:25.059402 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-4552-account-create-25f9d"] Oct 01 10:43:25 
crc kubenswrapper[4735]: I1001 10:43:25.073593 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-d6e0-account-create-2pwpg"] Oct 01 10:43:25 crc kubenswrapper[4735]: I1001 10:43:25.083463 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-e0db-account-create-d2r8g"] Oct 01 10:43:25 crc kubenswrapper[4735]: I1001 10:43:25.091416 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-e0db-account-create-d2r8g"] Oct 01 10:43:25 crc kubenswrapper[4735]: I1001 10:43:25.097723 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-4552-account-create-25f9d"] Oct 01 10:43:25 crc kubenswrapper[4735]: I1001 10:43:25.105730 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-d6e0-account-create-2pwpg"] Oct 01 10:43:25 crc kubenswrapper[4735]: I1001 10:43:25.383852 4735 scope.go:117] "RemoveContainer" containerID="03b034a160c6e1a708e4ca2d12ea746acd6a4333778b399011ed858edc88fed5" Oct 01 10:43:25 crc kubenswrapper[4735]: I1001 10:43:25.425392 4735 scope.go:117] "RemoveContainer" containerID="98568aad39b8f4d7cc86870dd3a4b1084add1343cd024e8f4ad7cb41052875b3" Oct 01 10:43:25 crc kubenswrapper[4735]: I1001 10:43:25.459932 4735 scope.go:117] "RemoveContainer" containerID="3cbe5988d296450eab37bf2ab84072dadfdb357254fe7e47cc8b463b684e6d52" Oct 01 10:43:25 crc kubenswrapper[4735]: I1001 10:43:25.507362 4735 scope.go:117] "RemoveContainer" containerID="be5d117abefe9e28d4ed58fc95f5ace4de3a7aa6e3500377022e8b00f3f68d0d" Oct 01 10:43:25 crc kubenswrapper[4735]: I1001 10:43:25.538164 4735 scope.go:117] "RemoveContainer" containerID="bac73fe606dbd822a876584978dd75f4b2c539a1dc1635d3cd614c2a4c92a23f" Oct 01 10:43:25 crc kubenswrapper[4735]: I1001 10:43:25.578915 4735 scope.go:117] "RemoveContainer" containerID="b2a34c63e12d94aa2b41403d524670e670a6eca614006600e38b5d5cc5c464c9" Oct 01 10:43:25 crc kubenswrapper[4735]: I1001 10:43:25.615313 4735 scope.go:117] 
"RemoveContainer" containerID="26d0fd0f6eefd8bf55f006dee6bc13552f42c805ed7016a1dd04c3e7b62ab7a0" Oct 01 10:43:25 crc kubenswrapper[4735]: I1001 10:43:25.646822 4735 scope.go:117] "RemoveContainer" containerID="a2be7bfff3648b55a6637e07458d8410ce6fda444ff880112fe2736e341480d5" Oct 01 10:43:25 crc kubenswrapper[4735]: I1001 10:43:25.679690 4735 scope.go:117] "RemoveContainer" containerID="0e00181e8119afb1f18c8b915d6795d31a721f53898fbdbef34cd4833c6e1055" Oct 01 10:43:25 crc kubenswrapper[4735]: I1001 10:43:25.908829 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19f45b80-8249-470b-b45c-ecca38477609" path="/var/lib/kubelet/pods/19f45b80-8249-470b-b45c-ecca38477609/volumes" Oct 01 10:43:25 crc kubenswrapper[4735]: I1001 10:43:25.910050 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="314d523b-a835-4be5-b964-108fbc40db3a" path="/var/lib/kubelet/pods/314d523b-a835-4be5-b964-108fbc40db3a/volumes" Oct 01 10:43:25 crc kubenswrapper[4735]: I1001 10:43:25.911343 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c97b49f0-ca3a-4433-b3c8-549fef88bfc1" path="/var/lib/kubelet/pods/c97b49f0-ca3a-4433-b3c8-549fef88bfc1/volumes" Oct 01 10:43:27 crc kubenswrapper[4735]: I1001 10:43:27.032944 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-xqbbf"] Oct 01 10:43:27 crc kubenswrapper[4735]: I1001 10:43:27.045086 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-xqbbf"] Oct 01 10:43:27 crc kubenswrapper[4735]: I1001 10:43:27.913749 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c176cb22-cf84-4a55-aac6-6fff6792ea56" path="/var/lib/kubelet/pods/c176cb22-cf84-4a55-aac6-6fff6792ea56/volumes" Oct 01 10:43:32 crc kubenswrapper[4735]: I1001 10:43:32.028273 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-t8d7j"] Oct 01 10:43:32 crc kubenswrapper[4735]: I1001 10:43:32.037686 
4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-t8d7j"] Oct 01 10:43:33 crc kubenswrapper[4735]: I1001 10:43:33.912376 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7cb175b-c967-47c2-96b0-da043b6d3506" path="/var/lib/kubelet/pods/c7cb175b-c967-47c2-96b0-da043b6d3506/volumes" Oct 01 10:43:35 crc kubenswrapper[4735]: I1001 10:43:35.485629 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 10:43:35 crc kubenswrapper[4735]: I1001 10:43:35.485988 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 10:43:35 crc kubenswrapper[4735]: I1001 10:43:35.486052 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" Oct 01 10:43:35 crc kubenswrapper[4735]: I1001 10:43:35.487084 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6bb0238a5016780708fe4e2202a4b58f82d3046437841d7c2fad3739fa55b3ba"} pod="openshift-machine-config-operator/machine-config-daemon-xgg24" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 10:43:35 crc kubenswrapper[4735]: I1001 10:43:35.487187 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" 
containerName="machine-config-daemon" containerID="cri-o://6bb0238a5016780708fe4e2202a4b58f82d3046437841d7c2fad3739fa55b3ba" gracePeriod=600 Oct 01 10:43:35 crc kubenswrapper[4735]: E1001 10:43:35.612022 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:43:36 crc kubenswrapper[4735]: I1001 10:43:36.227201 4735 generic.go:334] "Generic (PLEG): container finished" podID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerID="6bb0238a5016780708fe4e2202a4b58f82d3046437841d7c2fad3739fa55b3ba" exitCode=0 Oct 01 10:43:36 crc kubenswrapper[4735]: I1001 10:43:36.227338 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" event={"ID":"8c2fdbf0-2469-4ca0-8624-d63609123cd1","Type":"ContainerDied","Data":"6bb0238a5016780708fe4e2202a4b58f82d3046437841d7c2fad3739fa55b3ba"} Oct 01 10:43:36 crc kubenswrapper[4735]: I1001 10:43:36.227802 4735 scope.go:117] "RemoveContainer" containerID="32fd47d6848f87a0586764661e897b70eae54a447c3515a2ff59f02148ba9b6a" Oct 01 10:43:36 crc kubenswrapper[4735]: I1001 10:43:36.229148 4735 scope.go:117] "RemoveContainer" containerID="6bb0238a5016780708fe4e2202a4b58f82d3046437841d7c2fad3739fa55b3ba" Oct 01 10:43:36 crc kubenswrapper[4735]: E1001 10:43:36.229571 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:43:44 crc kubenswrapper[4735]: I1001 10:43:44.046937 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-grznc"] Oct 01 10:43:44 crc kubenswrapper[4735]: I1001 10:43:44.057061 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-grznc"] Oct 01 10:43:45 crc kubenswrapper[4735]: I1001 10:43:45.913250 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6" path="/var/lib/kubelet/pods/e3d7f4c0-a8ed-4dd2-a25e-c83937ae19a6/volumes" Oct 01 10:43:48 crc kubenswrapper[4735]: I1001 10:43:48.897633 4735 scope.go:117] "RemoveContainer" containerID="6bb0238a5016780708fe4e2202a4b58f82d3046437841d7c2fad3739fa55b3ba" Oct 01 10:43:48 crc kubenswrapper[4735]: E1001 10:43:48.898292 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:44:03 crc kubenswrapper[4735]: I1001 10:44:03.896534 4735 scope.go:117] "RemoveContainer" containerID="6bb0238a5016780708fe4e2202a4b58f82d3046437841d7c2fad3739fa55b3ba" Oct 01 10:44:03 crc kubenswrapper[4735]: E1001 10:44:03.897254 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" 
podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:44:14 crc kubenswrapper[4735]: I1001 10:44:14.898033 4735 scope.go:117] "RemoveContainer" containerID="6bb0238a5016780708fe4e2202a4b58f82d3046437841d7c2fad3739fa55b3ba" Oct 01 10:44:14 crc kubenswrapper[4735]: E1001 10:44:14.899074 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:44:15 crc kubenswrapper[4735]: I1001 10:44:15.611639 4735 generic.go:334] "Generic (PLEG): container finished" podID="697ab5a4-56ef-4755-9211-fcd52866c939" containerID="672933b359729aa029e5eaebf980f16acf4961796e388b7b571dba87898fddd6" exitCode=0 Oct 01 10:44:15 crc kubenswrapper[4735]: I1001 10:44:15.611696 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4hx9b" event={"ID":"697ab5a4-56ef-4755-9211-fcd52866c939","Type":"ContainerDied","Data":"672933b359729aa029e5eaebf980f16acf4961796e388b7b571dba87898fddd6"} Oct 01 10:44:16 crc kubenswrapper[4735]: I1001 10:44:16.997713 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4hx9b" Oct 01 10:44:17 crc kubenswrapper[4735]: I1001 10:44:17.058181 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-lcc6x"] Oct 01 10:44:17 crc kubenswrapper[4735]: I1001 10:44:17.067947 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-lcc6x"] Oct 01 10:44:17 crc kubenswrapper[4735]: I1001 10:44:17.156664 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/697ab5a4-56ef-4755-9211-fcd52866c939-ssh-key\") pod \"697ab5a4-56ef-4755-9211-fcd52866c939\" (UID: \"697ab5a4-56ef-4755-9211-fcd52866c939\") " Oct 01 10:44:17 crc kubenswrapper[4735]: I1001 10:44:17.157068 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/697ab5a4-56ef-4755-9211-fcd52866c939-inventory\") pod \"697ab5a4-56ef-4755-9211-fcd52866c939\" (UID: \"697ab5a4-56ef-4755-9211-fcd52866c939\") " Oct 01 10:44:17 crc kubenswrapper[4735]: I1001 10:44:17.157316 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wccpb\" (UniqueName: \"kubernetes.io/projected/697ab5a4-56ef-4755-9211-fcd52866c939-kube-api-access-wccpb\") pod \"697ab5a4-56ef-4755-9211-fcd52866c939\" (UID: \"697ab5a4-56ef-4755-9211-fcd52866c939\") " Oct 01 10:44:17 crc kubenswrapper[4735]: I1001 10:44:17.164118 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/697ab5a4-56ef-4755-9211-fcd52866c939-kube-api-access-wccpb" (OuterVolumeSpecName: "kube-api-access-wccpb") pod "697ab5a4-56ef-4755-9211-fcd52866c939" (UID: "697ab5a4-56ef-4755-9211-fcd52866c939"). InnerVolumeSpecName "kube-api-access-wccpb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:44:17 crc kubenswrapper[4735]: I1001 10:44:17.183206 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/697ab5a4-56ef-4755-9211-fcd52866c939-inventory" (OuterVolumeSpecName: "inventory") pod "697ab5a4-56ef-4755-9211-fcd52866c939" (UID: "697ab5a4-56ef-4755-9211-fcd52866c939"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:44:17 crc kubenswrapper[4735]: I1001 10:44:17.190460 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/697ab5a4-56ef-4755-9211-fcd52866c939-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "697ab5a4-56ef-4755-9211-fcd52866c939" (UID: "697ab5a4-56ef-4755-9211-fcd52866c939"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:44:17 crc kubenswrapper[4735]: I1001 10:44:17.258797 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/697ab5a4-56ef-4755-9211-fcd52866c939-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 10:44:17 crc kubenswrapper[4735]: I1001 10:44:17.258823 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/697ab5a4-56ef-4755-9211-fcd52866c939-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 10:44:17 crc kubenswrapper[4735]: I1001 10:44:17.258835 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wccpb\" (UniqueName: \"kubernetes.io/projected/697ab5a4-56ef-4755-9211-fcd52866c939-kube-api-access-wccpb\") on node \"crc\" DevicePath \"\"" Oct 01 10:44:17 crc kubenswrapper[4735]: I1001 10:44:17.631836 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4hx9b" 
event={"ID":"697ab5a4-56ef-4755-9211-fcd52866c939","Type":"ContainerDied","Data":"86c0ee5c3018a882c2814a93597ffeed5fc92d767267fbcf2bb6e1f6b33f7a1c"} Oct 01 10:44:17 crc kubenswrapper[4735]: I1001 10:44:17.631882 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86c0ee5c3018a882c2814a93597ffeed5fc92d767267fbcf2bb6e1f6b33f7a1c" Oct 01 10:44:17 crc kubenswrapper[4735]: I1001 10:44:17.631922 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4hx9b" Oct 01 10:44:17 crc kubenswrapper[4735]: I1001 10:44:17.716174 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t8gc9"] Oct 01 10:44:17 crc kubenswrapper[4735]: E1001 10:44:17.716867 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="697ab5a4-56ef-4755-9211-fcd52866c939" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 01 10:44:17 crc kubenswrapper[4735]: I1001 10:44:17.716970 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="697ab5a4-56ef-4755-9211-fcd52866c939" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 01 10:44:17 crc kubenswrapper[4735]: I1001 10:44:17.717293 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="697ab5a4-56ef-4755-9211-fcd52866c939" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 01 10:44:17 crc kubenswrapper[4735]: I1001 10:44:17.720088 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t8gc9" Oct 01 10:44:17 crc kubenswrapper[4735]: I1001 10:44:17.723180 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 10:44:17 crc kubenswrapper[4735]: I1001 10:44:17.723613 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ggpjk" Oct 01 10:44:17 crc kubenswrapper[4735]: I1001 10:44:17.723810 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 10:44:17 crc kubenswrapper[4735]: I1001 10:44:17.726862 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 10:44:17 crc kubenswrapper[4735]: I1001 10:44:17.736067 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t8gc9"] Oct 01 10:44:17 crc kubenswrapper[4735]: I1001 10:44:17.870017 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02bd9618-e194-4d1b-98f5-90ab53e53e39-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t8gc9\" (UID: \"02bd9618-e194-4d1b-98f5-90ab53e53e39\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t8gc9" Oct 01 10:44:17 crc kubenswrapper[4735]: I1001 10:44:17.871129 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/02bd9618-e194-4d1b-98f5-90ab53e53e39-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t8gc9\" (UID: \"02bd9618-e194-4d1b-98f5-90ab53e53e39\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t8gc9" Oct 01 10:44:17 crc kubenswrapper[4735]: I1001 10:44:17.871426 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7s7v\" (UniqueName: \"kubernetes.io/projected/02bd9618-e194-4d1b-98f5-90ab53e53e39-kube-api-access-m7s7v\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t8gc9\" (UID: \"02bd9618-e194-4d1b-98f5-90ab53e53e39\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t8gc9" Oct 01 10:44:17 crc kubenswrapper[4735]: I1001 10:44:17.912916 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6039fc3-1cee-4902-81bd-0f35cf2eaa96" path="/var/lib/kubelet/pods/f6039fc3-1cee-4902-81bd-0f35cf2eaa96/volumes" Oct 01 10:44:17 crc kubenswrapper[4735]: I1001 10:44:17.973956 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02bd9618-e194-4d1b-98f5-90ab53e53e39-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t8gc9\" (UID: \"02bd9618-e194-4d1b-98f5-90ab53e53e39\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t8gc9" Oct 01 10:44:17 crc kubenswrapper[4735]: I1001 10:44:17.974615 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/02bd9618-e194-4d1b-98f5-90ab53e53e39-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t8gc9\" (UID: \"02bd9618-e194-4d1b-98f5-90ab53e53e39\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t8gc9" Oct 01 10:44:17 crc kubenswrapper[4735]: I1001 10:44:17.974844 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7s7v\" (UniqueName: \"kubernetes.io/projected/02bd9618-e194-4d1b-98f5-90ab53e53e39-kube-api-access-m7s7v\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t8gc9\" (UID: \"02bd9618-e194-4d1b-98f5-90ab53e53e39\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t8gc9" Oct 
01 10:44:17 crc kubenswrapper[4735]: I1001 10:44:17.978755 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/02bd9618-e194-4d1b-98f5-90ab53e53e39-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t8gc9\" (UID: \"02bd9618-e194-4d1b-98f5-90ab53e53e39\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t8gc9" Oct 01 10:44:17 crc kubenswrapper[4735]: I1001 10:44:17.979711 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02bd9618-e194-4d1b-98f5-90ab53e53e39-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t8gc9\" (UID: \"02bd9618-e194-4d1b-98f5-90ab53e53e39\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t8gc9" Oct 01 10:44:17 crc kubenswrapper[4735]: I1001 10:44:17.998792 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7s7v\" (UniqueName: \"kubernetes.io/projected/02bd9618-e194-4d1b-98f5-90ab53e53e39-kube-api-access-m7s7v\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t8gc9\" (UID: \"02bd9618-e194-4d1b-98f5-90ab53e53e39\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t8gc9" Oct 01 10:44:18 crc kubenswrapper[4735]: I1001 10:44:18.030253 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-zjsng"] Oct 01 10:44:18 crc kubenswrapper[4735]: I1001 10:44:18.042078 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-5nzjx"] Oct 01 10:44:18 crc kubenswrapper[4735]: I1001 10:44:18.046533 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t8gc9" Oct 01 10:44:18 crc kubenswrapper[4735]: I1001 10:44:18.056172 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-zjsng"] Oct 01 10:44:18 crc kubenswrapper[4735]: I1001 10:44:18.068574 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-5nzjx"] Oct 01 10:44:18 crc kubenswrapper[4735]: I1001 10:44:18.540000 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t8gc9"] Oct 01 10:44:18 crc kubenswrapper[4735]: I1001 10:44:18.640850 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t8gc9" event={"ID":"02bd9618-e194-4d1b-98f5-90ab53e53e39","Type":"ContainerStarted","Data":"877fa35e1146c59527ecbe55381ad542113bba66e16ddc3ede188c7c0ca09577"} Oct 01 10:44:19 crc kubenswrapper[4735]: I1001 10:44:19.655860 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t8gc9" event={"ID":"02bd9618-e194-4d1b-98f5-90ab53e53e39","Type":"ContainerStarted","Data":"a62fafc6889d7a94c581b5f7c28e8978bf2e54fa015203bdda89042c920cca66"} Oct 01 10:44:19 crc kubenswrapper[4735]: I1001 10:44:19.670958 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t8gc9" podStartSLOduration=2.006068372 podStartE2EDuration="2.670940016s" podCreationTimestamp="2025-10-01 10:44:17 +0000 UTC" firstStartedPulling="2025-10-01 10:44:18.557781433 +0000 UTC m=+1617.250602695" lastFinishedPulling="2025-10-01 10:44:19.222653077 +0000 UTC m=+1617.915474339" observedRunningTime="2025-10-01 10:44:19.669077127 +0000 UTC m=+1618.361898409" watchObservedRunningTime="2025-10-01 10:44:19.670940016 +0000 UTC m=+1618.363761298" Oct 01 10:44:19 crc 
kubenswrapper[4735]: I1001 10:44:19.911190 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aac33e2-e90f-4db8-95a4-b676416a6781" path="/var/lib/kubelet/pods/3aac33e2-e90f-4db8-95a4-b676416a6781/volumes" Oct 01 10:44:19 crc kubenswrapper[4735]: I1001 10:44:19.911791 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8163cb1d-c1d1-48cd-8e45-6f239a7095c1" path="/var/lib/kubelet/pods/8163cb1d-c1d1-48cd-8e45-6f239a7095c1/volumes" Oct 01 10:44:24 crc kubenswrapper[4735]: I1001 10:44:24.025930 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-8cvdn"] Oct 01 10:44:24 crc kubenswrapper[4735]: I1001 10:44:24.033603 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-8cvdn"] Oct 01 10:44:24 crc kubenswrapper[4735]: I1001 10:44:24.715636 4735 generic.go:334] "Generic (PLEG): container finished" podID="02bd9618-e194-4d1b-98f5-90ab53e53e39" containerID="a62fafc6889d7a94c581b5f7c28e8978bf2e54fa015203bdda89042c920cca66" exitCode=0 Oct 01 10:44:24 crc kubenswrapper[4735]: I1001 10:44:24.715759 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t8gc9" event={"ID":"02bd9618-e194-4d1b-98f5-90ab53e53e39","Type":"ContainerDied","Data":"a62fafc6889d7a94c581b5f7c28e8978bf2e54fa015203bdda89042c920cca66"} Oct 01 10:44:25 crc kubenswrapper[4735]: I1001 10:44:25.857392 4735 scope.go:117] "RemoveContainer" containerID="88fc78d2940c2fb975d246994510ce591ee999be68ce8220841a5eb38973e6fe" Oct 01 10:44:25 crc kubenswrapper[4735]: I1001 10:44:25.891467 4735 scope.go:117] "RemoveContainer" containerID="beb2507f6d4c56e9075de1d8ca34ff241ef528a92894be152b5b153d79b01c12" Oct 01 10:44:25 crc kubenswrapper[4735]: I1001 10:44:25.925936 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="541784dc-4146-459d-bee0-2f97d22a7977" 
path="/var/lib/kubelet/pods/541784dc-4146-459d-bee0-2f97d22a7977/volumes" Oct 01 10:44:25 crc kubenswrapper[4735]: I1001 10:44:25.960448 4735 scope.go:117] "RemoveContainer" containerID="a97369da3fa95662846cba8da6be9a0d38d7dd64fd9aa8205804c9c4faa7904f" Oct 01 10:44:25 crc kubenswrapper[4735]: I1001 10:44:25.992066 4735 scope.go:117] "RemoveContainer" containerID="5688c7fc60494353be2b6ebfcefc98759a71666e60aa632e6319fbe57301999d" Oct 01 10:44:26 crc kubenswrapper[4735]: I1001 10:44:26.097725 4735 scope.go:117] "RemoveContainer" containerID="19eea9f23b739e7a0e61f0315d3d8dc06408a21cadf84144922df57633f716f0" Oct 01 10:44:26 crc kubenswrapper[4735]: I1001 10:44:26.113634 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t8gc9" Oct 01 10:44:26 crc kubenswrapper[4735]: I1001 10:44:26.157356 4735 scope.go:117] "RemoveContainer" containerID="55117a38017e670a1b2932fdcb18de07cf6303845e8d97559e98dcb0242b0369" Oct 01 10:44:26 crc kubenswrapper[4735]: I1001 10:44:26.186308 4735 scope.go:117] "RemoveContainer" containerID="c3ceca45415b38d092d174cc57e27229ae5db3eee3de8baeae4b56e8e5c00c68" Oct 01 10:44:26 crc kubenswrapper[4735]: I1001 10:44:26.209481 4735 scope.go:117] "RemoveContainer" containerID="871ba6435ca82ef3378113758f71d32f1bf5a06ff46ed614a610e83dca5a4e19" Oct 01 10:44:26 crc kubenswrapper[4735]: I1001 10:44:26.222910 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/02bd9618-e194-4d1b-98f5-90ab53e53e39-ssh-key\") pod \"02bd9618-e194-4d1b-98f5-90ab53e53e39\" (UID: \"02bd9618-e194-4d1b-98f5-90ab53e53e39\") " Oct 01 10:44:26 crc kubenswrapper[4735]: I1001 10:44:26.222996 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02bd9618-e194-4d1b-98f5-90ab53e53e39-inventory\") pod \"02bd9618-e194-4d1b-98f5-90ab53e53e39\" 
(UID: \"02bd9618-e194-4d1b-98f5-90ab53e53e39\") " Oct 01 10:44:26 crc kubenswrapper[4735]: I1001 10:44:26.223219 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7s7v\" (UniqueName: \"kubernetes.io/projected/02bd9618-e194-4d1b-98f5-90ab53e53e39-kube-api-access-m7s7v\") pod \"02bd9618-e194-4d1b-98f5-90ab53e53e39\" (UID: \"02bd9618-e194-4d1b-98f5-90ab53e53e39\") " Oct 01 10:44:26 crc kubenswrapper[4735]: I1001 10:44:26.229842 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02bd9618-e194-4d1b-98f5-90ab53e53e39-kube-api-access-m7s7v" (OuterVolumeSpecName: "kube-api-access-m7s7v") pod "02bd9618-e194-4d1b-98f5-90ab53e53e39" (UID: "02bd9618-e194-4d1b-98f5-90ab53e53e39"). InnerVolumeSpecName "kube-api-access-m7s7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:44:26 crc kubenswrapper[4735]: I1001 10:44:26.233676 4735 scope.go:117] "RemoveContainer" containerID="9e576938ab80f23960b56223b285b3cd0eb4e29059a1804a453a3467282b53cf" Oct 01 10:44:26 crc kubenswrapper[4735]: I1001 10:44:26.255537 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02bd9618-e194-4d1b-98f5-90ab53e53e39-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "02bd9618-e194-4d1b-98f5-90ab53e53e39" (UID: "02bd9618-e194-4d1b-98f5-90ab53e53e39"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:44:26 crc kubenswrapper[4735]: I1001 10:44:26.256755 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02bd9618-e194-4d1b-98f5-90ab53e53e39-inventory" (OuterVolumeSpecName: "inventory") pod "02bd9618-e194-4d1b-98f5-90ab53e53e39" (UID: "02bd9618-e194-4d1b-98f5-90ab53e53e39"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:44:26 crc kubenswrapper[4735]: I1001 10:44:26.276774 4735 scope.go:117] "RemoveContainer" containerID="f24ed144ef04ba958cbee6c962d667d0f9ea158eb7e889438e1e365107c66e21" Oct 01 10:44:26 crc kubenswrapper[4735]: I1001 10:44:26.325526 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7s7v\" (UniqueName: \"kubernetes.io/projected/02bd9618-e194-4d1b-98f5-90ab53e53e39-kube-api-access-m7s7v\") on node \"crc\" DevicePath \"\"" Oct 01 10:44:26 crc kubenswrapper[4735]: I1001 10:44:26.325555 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/02bd9618-e194-4d1b-98f5-90ab53e53e39-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 10:44:26 crc kubenswrapper[4735]: I1001 10:44:26.325567 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02bd9618-e194-4d1b-98f5-90ab53e53e39-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 10:44:26 crc kubenswrapper[4735]: I1001 10:44:26.737513 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t8gc9" event={"ID":"02bd9618-e194-4d1b-98f5-90ab53e53e39","Type":"ContainerDied","Data":"877fa35e1146c59527ecbe55381ad542113bba66e16ddc3ede188c7c0ca09577"} Oct 01 10:44:26 crc kubenswrapper[4735]: I1001 10:44:26.737566 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="877fa35e1146c59527ecbe55381ad542113bba66e16ddc3ede188c7c0ca09577" Oct 01 10:44:26 crc kubenswrapper[4735]: I1001 10:44:26.737586 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t8gc9" Oct 01 10:44:26 crc kubenswrapper[4735]: I1001 10:44:26.804131 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-cd2rs"] Oct 01 10:44:26 crc kubenswrapper[4735]: E1001 10:44:26.804680 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02bd9618-e194-4d1b-98f5-90ab53e53e39" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 01 10:44:26 crc kubenswrapper[4735]: I1001 10:44:26.804707 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="02bd9618-e194-4d1b-98f5-90ab53e53e39" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 01 10:44:26 crc kubenswrapper[4735]: I1001 10:44:26.804925 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="02bd9618-e194-4d1b-98f5-90ab53e53e39" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 01 10:44:26 crc kubenswrapper[4735]: I1001 10:44:26.806034 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cd2rs" Oct 01 10:44:26 crc kubenswrapper[4735]: I1001 10:44:26.810698 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 10:44:26 crc kubenswrapper[4735]: I1001 10:44:26.810722 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ggpjk" Oct 01 10:44:26 crc kubenswrapper[4735]: I1001 10:44:26.810767 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 10:44:26 crc kubenswrapper[4735]: I1001 10:44:26.827569 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 10:44:26 crc kubenswrapper[4735]: I1001 10:44:26.835579 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-cd2rs"] Oct 01 10:44:26 crc kubenswrapper[4735]: I1001 10:44:26.938442 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/569a20cc-087a-4c93-b23b-af5c6b209b80-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-cd2rs\" (UID: \"569a20cc-087a-4c93-b23b-af5c6b209b80\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cd2rs" Oct 01 10:44:26 crc kubenswrapper[4735]: I1001 10:44:26.938514 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/569a20cc-087a-4c93-b23b-af5c6b209b80-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-cd2rs\" (UID: \"569a20cc-087a-4c93-b23b-af5c6b209b80\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cd2rs" Oct 01 10:44:26 crc kubenswrapper[4735]: I1001 10:44:26.938684 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw5gf\" (UniqueName: \"kubernetes.io/projected/569a20cc-087a-4c93-b23b-af5c6b209b80-kube-api-access-dw5gf\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-cd2rs\" (UID: \"569a20cc-087a-4c93-b23b-af5c6b209b80\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cd2rs" Oct 01 10:44:27 crc kubenswrapper[4735]: I1001 10:44:27.025338 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-ea66-account-create-hsfw4"] Oct 01 10:44:27 crc kubenswrapper[4735]: I1001 10:44:27.033280 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-3a5f-account-create-2knz5"] Oct 01 10:44:27 crc kubenswrapper[4735]: I1001 10:44:27.040378 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw5gf\" (UniqueName: \"kubernetes.io/projected/569a20cc-087a-4c93-b23b-af5c6b209b80-kube-api-access-dw5gf\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-cd2rs\" (UID: \"569a20cc-087a-4c93-b23b-af5c6b209b80\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cd2rs" Oct 01 10:44:27 crc kubenswrapper[4735]: I1001 10:44:27.040626 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/569a20cc-087a-4c93-b23b-af5c6b209b80-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-cd2rs\" (UID: \"569a20cc-087a-4c93-b23b-af5c6b209b80\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cd2rs" Oct 01 10:44:27 crc kubenswrapper[4735]: I1001 10:44:27.040676 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/569a20cc-087a-4c93-b23b-af5c6b209b80-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-cd2rs\" (UID: \"569a20cc-087a-4c93-b23b-af5c6b209b80\") " 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cd2rs" Oct 01 10:44:27 crc kubenswrapper[4735]: I1001 10:44:27.041192 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-ea66-account-create-hsfw4"] Oct 01 10:44:27 crc kubenswrapper[4735]: I1001 10:44:27.046189 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/569a20cc-087a-4c93-b23b-af5c6b209b80-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-cd2rs\" (UID: \"569a20cc-087a-4c93-b23b-af5c6b209b80\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cd2rs" Oct 01 10:44:27 crc kubenswrapper[4735]: I1001 10:44:27.054553 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cdcc-account-create-5vjp4"] Oct 01 10:44:27 crc kubenswrapper[4735]: I1001 10:44:27.061552 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-3a5f-account-create-2knz5"] Oct 01 10:44:27 crc kubenswrapper[4735]: I1001 10:44:27.067144 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cdcc-account-create-5vjp4"] Oct 01 10:44:27 crc kubenswrapper[4735]: I1001 10:44:27.067149 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/569a20cc-087a-4c93-b23b-af5c6b209b80-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-cd2rs\" (UID: \"569a20cc-087a-4c93-b23b-af5c6b209b80\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cd2rs" Oct 01 10:44:27 crc kubenswrapper[4735]: I1001 10:44:27.093780 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw5gf\" (UniqueName: \"kubernetes.io/projected/569a20cc-087a-4c93-b23b-af5c6b209b80-kube-api-access-dw5gf\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-cd2rs\" (UID: \"569a20cc-087a-4c93-b23b-af5c6b209b80\") " 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cd2rs" Oct 01 10:44:27 crc kubenswrapper[4735]: I1001 10:44:27.139930 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cd2rs" Oct 01 10:44:27 crc kubenswrapper[4735]: I1001 10:44:27.662651 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-cd2rs"] Oct 01 10:44:27 crc kubenswrapper[4735]: I1001 10:44:27.744903 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cd2rs" event={"ID":"569a20cc-087a-4c93-b23b-af5c6b209b80","Type":"ContainerStarted","Data":"85bad8139839fe215f4ff66222f5f748c338f0b7c32454922c833d1ba66f2389"} Oct 01 10:44:27 crc kubenswrapper[4735]: I1001 10:44:27.898080 4735 scope.go:117] "RemoveContainer" containerID="6bb0238a5016780708fe4e2202a4b58f82d3046437841d7c2fad3739fa55b3ba" Oct 01 10:44:27 crc kubenswrapper[4735]: E1001 10:44:27.898427 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:44:27 crc kubenswrapper[4735]: I1001 10:44:27.909185 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01d783e4-455a-4e50-a5a6-ca1a289b2ad7" path="/var/lib/kubelet/pods/01d783e4-455a-4e50-a5a6-ca1a289b2ad7/volumes" Oct 01 10:44:27 crc kubenswrapper[4735]: I1001 10:44:27.909879 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f1489d8-cd50-4c5f-b6f8-22854e49b246" path="/var/lib/kubelet/pods/5f1489d8-cd50-4c5f-b6f8-22854e49b246/volumes" Oct 01 10:44:27 crc 
kubenswrapper[4735]: I1001 10:44:27.910377 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7258162b-c9ac-49b4-a0ef-ec5f19491efa" path="/var/lib/kubelet/pods/7258162b-c9ac-49b4-a0ef-ec5f19491efa/volumes" Oct 01 10:44:28 crc kubenswrapper[4735]: I1001 10:44:28.754416 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cd2rs" event={"ID":"569a20cc-087a-4c93-b23b-af5c6b209b80","Type":"ContainerStarted","Data":"3664ed83d84bfd4c7e84242ca04fafb48e4226f081f5951b3a171723bd1f52c8"} Oct 01 10:44:28 crc kubenswrapper[4735]: I1001 10:44:28.775393 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cd2rs" podStartSLOduration=2.342294543 podStartE2EDuration="2.775372659s" podCreationTimestamp="2025-10-01 10:44:26 +0000 UTC" firstStartedPulling="2025-10-01 10:44:27.668897813 +0000 UTC m=+1626.361719075" lastFinishedPulling="2025-10-01 10:44:28.101975909 +0000 UTC m=+1626.794797191" observedRunningTime="2025-10-01 10:44:28.768380963 +0000 UTC m=+1627.461202225" watchObservedRunningTime="2025-10-01 10:44:28.775372659 +0000 UTC m=+1627.468193931" Oct 01 10:44:42 crc kubenswrapper[4735]: I1001 10:44:42.897379 4735 scope.go:117] "RemoveContainer" containerID="6bb0238a5016780708fe4e2202a4b58f82d3046437841d7c2fad3739fa55b3ba" Oct 01 10:44:42 crc kubenswrapper[4735]: E1001 10:44:42.898280 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:44:56 crc kubenswrapper[4735]: I1001 10:44:56.897840 4735 scope.go:117] "RemoveContainer" 
containerID="6bb0238a5016780708fe4e2202a4b58f82d3046437841d7c2fad3739fa55b3ba" Oct 01 10:44:56 crc kubenswrapper[4735]: E1001 10:44:56.898771 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:45:00 crc kubenswrapper[4735]: I1001 10:45:00.150631 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321925-4jxq8"] Oct 01 10:45:00 crc kubenswrapper[4735]: I1001 10:45:00.152398 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321925-4jxq8" Oct 01 10:45:00 crc kubenswrapper[4735]: I1001 10:45:00.155441 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 10:45:00 crc kubenswrapper[4735]: I1001 10:45:00.156448 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 10:45:00 crc kubenswrapper[4735]: I1001 10:45:00.170431 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321925-4jxq8"] Oct 01 10:45:00 crc kubenswrapper[4735]: I1001 10:45:00.306164 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4a0990f1-3497-4f2a-9fb0-05ae4c17e068-secret-volume\") pod \"collect-profiles-29321925-4jxq8\" (UID: \"4a0990f1-3497-4f2a-9fb0-05ae4c17e068\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29321925-4jxq8" Oct 01 10:45:00 crc kubenswrapper[4735]: I1001 10:45:00.306607 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4a0990f1-3497-4f2a-9fb0-05ae4c17e068-config-volume\") pod \"collect-profiles-29321925-4jxq8\" (UID: \"4a0990f1-3497-4f2a-9fb0-05ae4c17e068\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321925-4jxq8" Oct 01 10:45:00 crc kubenswrapper[4735]: I1001 10:45:00.307133 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jqhr\" (UniqueName: \"kubernetes.io/projected/4a0990f1-3497-4f2a-9fb0-05ae4c17e068-kube-api-access-8jqhr\") pod \"collect-profiles-29321925-4jxq8\" (UID: \"4a0990f1-3497-4f2a-9fb0-05ae4c17e068\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321925-4jxq8" Oct 01 10:45:00 crc kubenswrapper[4735]: I1001 10:45:00.409056 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4a0990f1-3497-4f2a-9fb0-05ae4c17e068-secret-volume\") pod \"collect-profiles-29321925-4jxq8\" (UID: \"4a0990f1-3497-4f2a-9fb0-05ae4c17e068\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321925-4jxq8" Oct 01 10:45:00 crc kubenswrapper[4735]: I1001 10:45:00.409807 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4a0990f1-3497-4f2a-9fb0-05ae4c17e068-config-volume\") pod \"collect-profiles-29321925-4jxq8\" (UID: \"4a0990f1-3497-4f2a-9fb0-05ae4c17e068\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321925-4jxq8" Oct 01 10:45:00 crc kubenswrapper[4735]: I1001 10:45:00.410028 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jqhr\" (UniqueName: 
\"kubernetes.io/projected/4a0990f1-3497-4f2a-9fb0-05ae4c17e068-kube-api-access-8jqhr\") pod \"collect-profiles-29321925-4jxq8\" (UID: \"4a0990f1-3497-4f2a-9fb0-05ae4c17e068\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321925-4jxq8" Oct 01 10:45:00 crc kubenswrapper[4735]: I1001 10:45:00.411241 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4a0990f1-3497-4f2a-9fb0-05ae4c17e068-config-volume\") pod \"collect-profiles-29321925-4jxq8\" (UID: \"4a0990f1-3497-4f2a-9fb0-05ae4c17e068\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321925-4jxq8" Oct 01 10:45:00 crc kubenswrapper[4735]: I1001 10:45:00.419348 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4a0990f1-3497-4f2a-9fb0-05ae4c17e068-secret-volume\") pod \"collect-profiles-29321925-4jxq8\" (UID: \"4a0990f1-3497-4f2a-9fb0-05ae4c17e068\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321925-4jxq8" Oct 01 10:45:00 crc kubenswrapper[4735]: I1001 10:45:00.442884 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jqhr\" (UniqueName: \"kubernetes.io/projected/4a0990f1-3497-4f2a-9fb0-05ae4c17e068-kube-api-access-8jqhr\") pod \"collect-profiles-29321925-4jxq8\" (UID: \"4a0990f1-3497-4f2a-9fb0-05ae4c17e068\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321925-4jxq8" Oct 01 10:45:00 crc kubenswrapper[4735]: I1001 10:45:00.473418 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321925-4jxq8" Oct 01 10:45:00 crc kubenswrapper[4735]: I1001 10:45:00.760623 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321925-4jxq8"] Oct 01 10:45:01 crc kubenswrapper[4735]: I1001 10:45:01.060098 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321925-4jxq8" event={"ID":"4a0990f1-3497-4f2a-9fb0-05ae4c17e068","Type":"ContainerStarted","Data":"7f0b352938fe58808f3d92eccb37b164f99c10b5db27fb7e38546c99185400d5"} Oct 01 10:45:03 crc kubenswrapper[4735]: I1001 10:45:03.085731 4735 generic.go:334] "Generic (PLEG): container finished" podID="4a0990f1-3497-4f2a-9fb0-05ae4c17e068" containerID="8302a1d47cc9885cf816c947897b364e95ab1e8887027d9c73f345f8159075d4" exitCode=0 Oct 01 10:45:03 crc kubenswrapper[4735]: I1001 10:45:03.085838 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321925-4jxq8" event={"ID":"4a0990f1-3497-4f2a-9fb0-05ae4c17e068","Type":"ContainerDied","Data":"8302a1d47cc9885cf816c947897b364e95ab1e8887027d9c73f345f8159075d4"} Oct 01 10:45:04 crc kubenswrapper[4735]: I1001 10:45:04.518484 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321925-4jxq8" Oct 01 10:45:04 crc kubenswrapper[4735]: I1001 10:45:04.600125 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4a0990f1-3497-4f2a-9fb0-05ae4c17e068-secret-volume\") pod \"4a0990f1-3497-4f2a-9fb0-05ae4c17e068\" (UID: \"4a0990f1-3497-4f2a-9fb0-05ae4c17e068\") " Oct 01 10:45:04 crc kubenswrapper[4735]: I1001 10:45:04.600259 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4a0990f1-3497-4f2a-9fb0-05ae4c17e068-config-volume\") pod \"4a0990f1-3497-4f2a-9fb0-05ae4c17e068\" (UID: \"4a0990f1-3497-4f2a-9fb0-05ae4c17e068\") " Oct 01 10:45:04 crc kubenswrapper[4735]: I1001 10:45:04.600373 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jqhr\" (UniqueName: \"kubernetes.io/projected/4a0990f1-3497-4f2a-9fb0-05ae4c17e068-kube-api-access-8jqhr\") pod \"4a0990f1-3497-4f2a-9fb0-05ae4c17e068\" (UID: \"4a0990f1-3497-4f2a-9fb0-05ae4c17e068\") " Oct 01 10:45:04 crc kubenswrapper[4735]: I1001 10:45:04.601057 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a0990f1-3497-4f2a-9fb0-05ae4c17e068-config-volume" (OuterVolumeSpecName: "config-volume") pod "4a0990f1-3497-4f2a-9fb0-05ae4c17e068" (UID: "4a0990f1-3497-4f2a-9fb0-05ae4c17e068"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:45:04 crc kubenswrapper[4735]: I1001 10:45:04.605584 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a0990f1-3497-4f2a-9fb0-05ae4c17e068-kube-api-access-8jqhr" (OuterVolumeSpecName: "kube-api-access-8jqhr") pod "4a0990f1-3497-4f2a-9fb0-05ae4c17e068" (UID: "4a0990f1-3497-4f2a-9fb0-05ae4c17e068"). 
InnerVolumeSpecName "kube-api-access-8jqhr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:45:04 crc kubenswrapper[4735]: I1001 10:45:04.607143 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a0990f1-3497-4f2a-9fb0-05ae4c17e068-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4a0990f1-3497-4f2a-9fb0-05ae4c17e068" (UID: "4a0990f1-3497-4f2a-9fb0-05ae4c17e068"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:45:04 crc kubenswrapper[4735]: I1001 10:45:04.703204 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4a0990f1-3497-4f2a-9fb0-05ae4c17e068-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 10:45:04 crc kubenswrapper[4735]: I1001 10:45:04.703257 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4a0990f1-3497-4f2a-9fb0-05ae4c17e068-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 10:45:04 crc kubenswrapper[4735]: I1001 10:45:04.703279 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jqhr\" (UniqueName: \"kubernetes.io/projected/4a0990f1-3497-4f2a-9fb0-05ae4c17e068-kube-api-access-8jqhr\") on node \"crc\" DevicePath \"\"" Oct 01 10:45:05 crc kubenswrapper[4735]: I1001 10:45:05.114801 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321925-4jxq8" event={"ID":"4a0990f1-3497-4f2a-9fb0-05ae4c17e068","Type":"ContainerDied","Data":"7f0b352938fe58808f3d92eccb37b164f99c10b5db27fb7e38546c99185400d5"} Oct 01 10:45:05 crc kubenswrapper[4735]: I1001 10:45:05.115088 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f0b352938fe58808f3d92eccb37b164f99c10b5db27fb7e38546c99185400d5" Oct 01 10:45:05 crc kubenswrapper[4735]: I1001 10:45:05.114911 4735 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321925-4jxq8" Oct 01 10:45:06 crc kubenswrapper[4735]: I1001 10:45:06.128062 4735 generic.go:334] "Generic (PLEG): container finished" podID="569a20cc-087a-4c93-b23b-af5c6b209b80" containerID="3664ed83d84bfd4c7e84242ca04fafb48e4226f081f5951b3a171723bd1f52c8" exitCode=0 Oct 01 10:45:06 crc kubenswrapper[4735]: I1001 10:45:06.128123 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cd2rs" event={"ID":"569a20cc-087a-4c93-b23b-af5c6b209b80","Type":"ContainerDied","Data":"3664ed83d84bfd4c7e84242ca04fafb48e4226f081f5951b3a171723bd1f52c8"} Oct 01 10:45:07 crc kubenswrapper[4735]: I1001 10:45:07.585227 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cd2rs" Oct 01 10:45:07 crc kubenswrapper[4735]: I1001 10:45:07.669811 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/569a20cc-087a-4c93-b23b-af5c6b209b80-ssh-key\") pod \"569a20cc-087a-4c93-b23b-af5c6b209b80\" (UID: \"569a20cc-087a-4c93-b23b-af5c6b209b80\") " Oct 01 10:45:07 crc kubenswrapper[4735]: I1001 10:45:07.669979 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/569a20cc-087a-4c93-b23b-af5c6b209b80-inventory\") pod \"569a20cc-087a-4c93-b23b-af5c6b209b80\" (UID: \"569a20cc-087a-4c93-b23b-af5c6b209b80\") " Oct 01 10:45:07 crc kubenswrapper[4735]: I1001 10:45:07.670083 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw5gf\" (UniqueName: \"kubernetes.io/projected/569a20cc-087a-4c93-b23b-af5c6b209b80-kube-api-access-dw5gf\") pod \"569a20cc-087a-4c93-b23b-af5c6b209b80\" (UID: \"569a20cc-087a-4c93-b23b-af5c6b209b80\") " Oct 01 10:45:07 
crc kubenswrapper[4735]: I1001 10:45:07.676111 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/569a20cc-087a-4c93-b23b-af5c6b209b80-kube-api-access-dw5gf" (OuterVolumeSpecName: "kube-api-access-dw5gf") pod "569a20cc-087a-4c93-b23b-af5c6b209b80" (UID: "569a20cc-087a-4c93-b23b-af5c6b209b80"). InnerVolumeSpecName "kube-api-access-dw5gf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:45:07 crc kubenswrapper[4735]: I1001 10:45:07.702942 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/569a20cc-087a-4c93-b23b-af5c6b209b80-inventory" (OuterVolumeSpecName: "inventory") pod "569a20cc-087a-4c93-b23b-af5c6b209b80" (UID: "569a20cc-087a-4c93-b23b-af5c6b209b80"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:45:07 crc kubenswrapper[4735]: I1001 10:45:07.707680 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/569a20cc-087a-4c93-b23b-af5c6b209b80-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "569a20cc-087a-4c93-b23b-af5c6b209b80" (UID: "569a20cc-087a-4c93-b23b-af5c6b209b80"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:45:07 crc kubenswrapper[4735]: I1001 10:45:07.771918 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/569a20cc-087a-4c93-b23b-af5c6b209b80-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 10:45:07 crc kubenswrapper[4735]: I1001 10:45:07.771951 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/569a20cc-087a-4c93-b23b-af5c6b209b80-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 10:45:07 crc kubenswrapper[4735]: I1001 10:45:07.771962 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw5gf\" (UniqueName: \"kubernetes.io/projected/569a20cc-087a-4c93-b23b-af5c6b209b80-kube-api-access-dw5gf\") on node \"crc\" DevicePath \"\"" Oct 01 10:45:07 crc kubenswrapper[4735]: I1001 10:45:07.897028 4735 scope.go:117] "RemoveContainer" containerID="6bb0238a5016780708fe4e2202a4b58f82d3046437841d7c2fad3739fa55b3ba" Oct 01 10:45:07 crc kubenswrapper[4735]: E1001 10:45:07.897355 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:45:08 crc kubenswrapper[4735]: I1001 10:45:08.045638 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-84vvz"] Oct 01 10:45:08 crc kubenswrapper[4735]: I1001 10:45:08.062128 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-84vvz"] Oct 01 10:45:08 crc kubenswrapper[4735]: I1001 10:45:08.148089 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cd2rs" event={"ID":"569a20cc-087a-4c93-b23b-af5c6b209b80","Type":"ContainerDied","Data":"85bad8139839fe215f4ff66222f5f748c338f0b7c32454922c833d1ba66f2389"} Oct 01 10:45:08 crc kubenswrapper[4735]: I1001 10:45:08.148125 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85bad8139839fe215f4ff66222f5f748c338f0b7c32454922c833d1ba66f2389" Oct 01 10:45:08 crc kubenswrapper[4735]: I1001 10:45:08.148203 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-cd2rs" Oct 01 10:45:08 crc kubenswrapper[4735]: I1001 10:45:08.242004 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-svdrq"] Oct 01 10:45:08 crc kubenswrapper[4735]: E1001 10:45:08.242550 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="569a20cc-087a-4c93-b23b-af5c6b209b80" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 01 10:45:08 crc kubenswrapper[4735]: I1001 10:45:08.242576 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="569a20cc-087a-4c93-b23b-af5c6b209b80" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 01 10:45:08 crc kubenswrapper[4735]: E1001 10:45:08.242606 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a0990f1-3497-4f2a-9fb0-05ae4c17e068" containerName="collect-profiles" Oct 01 10:45:08 crc kubenswrapper[4735]: I1001 10:45:08.242618 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a0990f1-3497-4f2a-9fb0-05ae4c17e068" containerName="collect-profiles" Oct 01 10:45:08 crc kubenswrapper[4735]: I1001 10:45:08.242939 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="569a20cc-087a-4c93-b23b-af5c6b209b80" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 01 10:45:08 crc kubenswrapper[4735]: I1001 10:45:08.242997 4735 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4a0990f1-3497-4f2a-9fb0-05ae4c17e068" containerName="collect-profiles" Oct 01 10:45:08 crc kubenswrapper[4735]: I1001 10:45:08.244021 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-svdrq" Oct 01 10:45:08 crc kubenswrapper[4735]: I1001 10:45:08.246327 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 10:45:08 crc kubenswrapper[4735]: I1001 10:45:08.247710 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ggpjk" Oct 01 10:45:08 crc kubenswrapper[4735]: I1001 10:45:08.247792 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 10:45:08 crc kubenswrapper[4735]: I1001 10:45:08.248615 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 10:45:08 crc kubenswrapper[4735]: I1001 10:45:08.251921 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-svdrq"] Oct 01 10:45:08 crc kubenswrapper[4735]: I1001 10:45:08.385486 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b381c11-584e-4a17-b4a4-cd150f2d3d82-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-svdrq\" (UID: \"5b381c11-584e-4a17-b4a4-cd150f2d3d82\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-svdrq" Oct 01 10:45:08 crc kubenswrapper[4735]: I1001 10:45:08.385891 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5b381c11-584e-4a17-b4a4-cd150f2d3d82-ssh-key\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-svdrq\" (UID: \"5b381c11-584e-4a17-b4a4-cd150f2d3d82\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-svdrq" Oct 01 10:45:08 crc kubenswrapper[4735]: I1001 10:45:08.386341 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxxz7\" (UniqueName: \"kubernetes.io/projected/5b381c11-584e-4a17-b4a4-cd150f2d3d82-kube-api-access-pxxz7\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-svdrq\" (UID: \"5b381c11-584e-4a17-b4a4-cd150f2d3d82\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-svdrq" Oct 01 10:45:08 crc kubenswrapper[4735]: I1001 10:45:08.488121 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxxz7\" (UniqueName: \"kubernetes.io/projected/5b381c11-584e-4a17-b4a4-cd150f2d3d82-kube-api-access-pxxz7\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-svdrq\" (UID: \"5b381c11-584e-4a17-b4a4-cd150f2d3d82\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-svdrq" Oct 01 10:45:08 crc kubenswrapper[4735]: I1001 10:45:08.488406 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b381c11-584e-4a17-b4a4-cd150f2d3d82-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-svdrq\" (UID: \"5b381c11-584e-4a17-b4a4-cd150f2d3d82\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-svdrq" Oct 01 10:45:08 crc kubenswrapper[4735]: I1001 10:45:08.488643 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5b381c11-584e-4a17-b4a4-cd150f2d3d82-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-svdrq\" (UID: \"5b381c11-584e-4a17-b4a4-cd150f2d3d82\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-svdrq" Oct 01 10:45:08 crc 
kubenswrapper[4735]: I1001 10:45:08.494976 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b381c11-584e-4a17-b4a4-cd150f2d3d82-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-svdrq\" (UID: \"5b381c11-584e-4a17-b4a4-cd150f2d3d82\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-svdrq" Oct 01 10:45:08 crc kubenswrapper[4735]: I1001 10:45:08.496064 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5b381c11-584e-4a17-b4a4-cd150f2d3d82-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-svdrq\" (UID: \"5b381c11-584e-4a17-b4a4-cd150f2d3d82\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-svdrq" Oct 01 10:45:08 crc kubenswrapper[4735]: I1001 10:45:08.523487 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxxz7\" (UniqueName: \"kubernetes.io/projected/5b381c11-584e-4a17-b4a4-cd150f2d3d82-kube-api-access-pxxz7\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-svdrq\" (UID: \"5b381c11-584e-4a17-b4a4-cd150f2d3d82\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-svdrq" Oct 01 10:45:08 crc kubenswrapper[4735]: I1001 10:45:08.574490 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-svdrq" Oct 01 10:45:09 crc kubenswrapper[4735]: I1001 10:45:09.119409 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-svdrq"] Oct 01 10:45:09 crc kubenswrapper[4735]: W1001 10:45:09.123130 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b381c11_584e_4a17_b4a4_cd150f2d3d82.slice/crio-ebb3196a6fb32a009350934d62d6a5fdc2b6e92c22900a215d60ac2fc6322f93 WatchSource:0}: Error finding container ebb3196a6fb32a009350934d62d6a5fdc2b6e92c22900a215d60ac2fc6322f93: Status 404 returned error can't find the container with id ebb3196a6fb32a009350934d62d6a5fdc2b6e92c22900a215d60ac2fc6322f93 Oct 01 10:45:09 crc kubenswrapper[4735]: I1001 10:45:09.159072 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-svdrq" event={"ID":"5b381c11-584e-4a17-b4a4-cd150f2d3d82","Type":"ContainerStarted","Data":"ebb3196a6fb32a009350934d62d6a5fdc2b6e92c22900a215d60ac2fc6322f93"} Oct 01 10:45:09 crc kubenswrapper[4735]: I1001 10:45:09.907244 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61dd2f37-7f60-42f5-a3d0-3b693d1e64be" path="/var/lib/kubelet/pods/61dd2f37-7f60-42f5-a3d0-3b693d1e64be/volumes" Oct 01 10:45:11 crc kubenswrapper[4735]: I1001 10:45:11.182350 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-svdrq" event={"ID":"5b381c11-584e-4a17-b4a4-cd150f2d3d82","Type":"ContainerStarted","Data":"a55fca9a46952a3c749a36c40ed17ac108cd23ae3b607171dfc462845ebafe6f"} Oct 01 10:45:11 crc kubenswrapper[4735]: I1001 10:45:11.209853 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-svdrq" podStartSLOduration=2.277354074 
podStartE2EDuration="3.209827105s" podCreationTimestamp="2025-10-01 10:45:08 +0000 UTC" firstStartedPulling="2025-10-01 10:45:09.125882844 +0000 UTC m=+1667.818704106" lastFinishedPulling="2025-10-01 10:45:10.058355865 +0000 UTC m=+1668.751177137" observedRunningTime="2025-10-01 10:45:11.205915111 +0000 UTC m=+1669.898736413" watchObservedRunningTime="2025-10-01 10:45:11.209827105 +0000 UTC m=+1669.902648397" Oct 01 10:45:20 crc kubenswrapper[4735]: I1001 10:45:20.897403 4735 scope.go:117] "RemoveContainer" containerID="6bb0238a5016780708fe4e2202a4b58f82d3046437841d7c2fad3739fa55b3ba" Oct 01 10:45:20 crc kubenswrapper[4735]: E1001 10:45:20.898194 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:45:24 crc kubenswrapper[4735]: I1001 10:45:24.035144 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x5q6q"] Oct 01 10:45:24 crc kubenswrapper[4735]: I1001 10:45:24.042765 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x5q6q"] Oct 01 10:45:25 crc kubenswrapper[4735]: I1001 10:45:25.907569 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caf22975-46d3-4339-87d4-e693baddc266" path="/var/lib/kubelet/pods/caf22975-46d3-4339-87d4-e693baddc266/volumes" Oct 01 10:45:26 crc kubenswrapper[4735]: I1001 10:45:26.459115 4735 scope.go:117] "RemoveContainer" containerID="17635a0cbfdcdbf14c7c896afa961ca2f921b7899ebb651337f393fd1ba050e8" Oct 01 10:45:26 crc kubenswrapper[4735]: I1001 10:45:26.497355 4735 scope.go:117] "RemoveContainer" 
containerID="6a5fb0e85c2303b725219da07142a08eb01cd52b0fd1b3767b58c64bec46bcd0" Oct 01 10:45:26 crc kubenswrapper[4735]: I1001 10:45:26.544562 4735 scope.go:117] "RemoveContainer" containerID="ab4a2a9e23a314bf28665e4af012b73eccb4d07fb6eda7f8f4e44048c19f2b90" Oct 01 10:45:26 crc kubenswrapper[4735]: I1001 10:45:26.572113 4735 scope.go:117] "RemoveContainer" containerID="e21226580f7cea2f4269330f076615a590679fab57d62f8c9b421f201fbd3e93" Oct 01 10:45:26 crc kubenswrapper[4735]: I1001 10:45:26.625630 4735 scope.go:117] "RemoveContainer" containerID="96e7915ab4bdb82f58a4c39d45fd16e211a2836ed3782e7709840e491a5e735c" Oct 01 10:45:33 crc kubenswrapper[4735]: I1001 10:45:33.897956 4735 scope.go:117] "RemoveContainer" containerID="6bb0238a5016780708fe4e2202a4b58f82d3046437841d7c2fad3739fa55b3ba" Oct 01 10:45:33 crc kubenswrapper[4735]: E1001 10:45:33.899924 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:45:45 crc kubenswrapper[4735]: I1001 10:45:45.898378 4735 scope.go:117] "RemoveContainer" containerID="6bb0238a5016780708fe4e2202a4b58f82d3046437841d7c2fad3739fa55b3ba" Oct 01 10:45:45 crc kubenswrapper[4735]: E1001 10:45:45.899594 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:45:47 crc 
kubenswrapper[4735]: I1001 10:45:47.045561 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-hwtzm"] Oct 01 10:45:47 crc kubenswrapper[4735]: I1001 10:45:47.055432 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-hwtzm"] Oct 01 10:45:47 crc kubenswrapper[4735]: I1001 10:45:47.906841 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="199d79d9-6c17-4d68-af5f-623ab4ceb059" path="/var/lib/kubelet/pods/199d79d9-6c17-4d68-af5f-623ab4ceb059/volumes" Oct 01 10:45:52 crc kubenswrapper[4735]: I1001 10:45:52.036184 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2dqx7"] Oct 01 10:45:52 crc kubenswrapper[4735]: I1001 10:45:52.045365 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2dqx7"] Oct 01 10:45:53 crc kubenswrapper[4735]: I1001 10:45:53.909607 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34d8e7eb-d10e-458f-ae05-e9c73c29b604" path="/var/lib/kubelet/pods/34d8e7eb-d10e-458f-ae05-e9c73c29b604/volumes" Oct 01 10:46:00 crc kubenswrapper[4735]: I1001 10:46:00.897548 4735 scope.go:117] "RemoveContainer" containerID="6bb0238a5016780708fe4e2202a4b58f82d3046437841d7c2fad3739fa55b3ba" Oct 01 10:46:00 crc kubenswrapper[4735]: E1001 10:46:00.898459 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:46:04 crc kubenswrapper[4735]: I1001 10:46:04.642223 4735 generic.go:334] "Generic (PLEG): container finished" podID="5b381c11-584e-4a17-b4a4-cd150f2d3d82" 
containerID="a55fca9a46952a3c749a36c40ed17ac108cd23ae3b607171dfc462845ebafe6f" exitCode=2 Oct 01 10:46:04 crc kubenswrapper[4735]: I1001 10:46:04.642305 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-svdrq" event={"ID":"5b381c11-584e-4a17-b4a4-cd150f2d3d82","Type":"ContainerDied","Data":"a55fca9a46952a3c749a36c40ed17ac108cd23ae3b607171dfc462845ebafe6f"} Oct 01 10:46:06 crc kubenswrapper[4735]: I1001 10:46:06.065707 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-svdrq" Oct 01 10:46:06 crc kubenswrapper[4735]: I1001 10:46:06.175337 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5b381c11-584e-4a17-b4a4-cd150f2d3d82-ssh-key\") pod \"5b381c11-584e-4a17-b4a4-cd150f2d3d82\" (UID: \"5b381c11-584e-4a17-b4a4-cd150f2d3d82\") " Oct 01 10:46:06 crc kubenswrapper[4735]: I1001 10:46:06.175433 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b381c11-584e-4a17-b4a4-cd150f2d3d82-inventory\") pod \"5b381c11-584e-4a17-b4a4-cd150f2d3d82\" (UID: \"5b381c11-584e-4a17-b4a4-cd150f2d3d82\") " Oct 01 10:46:06 crc kubenswrapper[4735]: I1001 10:46:06.175523 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxxz7\" (UniqueName: \"kubernetes.io/projected/5b381c11-584e-4a17-b4a4-cd150f2d3d82-kube-api-access-pxxz7\") pod \"5b381c11-584e-4a17-b4a4-cd150f2d3d82\" (UID: \"5b381c11-584e-4a17-b4a4-cd150f2d3d82\") " Oct 01 10:46:06 crc kubenswrapper[4735]: I1001 10:46:06.181583 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b381c11-584e-4a17-b4a4-cd150f2d3d82-kube-api-access-pxxz7" (OuterVolumeSpecName: "kube-api-access-pxxz7") pod 
"5b381c11-584e-4a17-b4a4-cd150f2d3d82" (UID: "5b381c11-584e-4a17-b4a4-cd150f2d3d82"). InnerVolumeSpecName "kube-api-access-pxxz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:46:06 crc kubenswrapper[4735]: I1001 10:46:06.201188 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b381c11-584e-4a17-b4a4-cd150f2d3d82-inventory" (OuterVolumeSpecName: "inventory") pod "5b381c11-584e-4a17-b4a4-cd150f2d3d82" (UID: "5b381c11-584e-4a17-b4a4-cd150f2d3d82"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:46:06 crc kubenswrapper[4735]: I1001 10:46:06.207786 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b381c11-584e-4a17-b4a4-cd150f2d3d82-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5b381c11-584e-4a17-b4a4-cd150f2d3d82" (UID: "5b381c11-584e-4a17-b4a4-cd150f2d3d82"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:46:06 crc kubenswrapper[4735]: I1001 10:46:06.278248 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5b381c11-584e-4a17-b4a4-cd150f2d3d82-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 10:46:06 crc kubenswrapper[4735]: I1001 10:46:06.278279 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b381c11-584e-4a17-b4a4-cd150f2d3d82-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 10:46:06 crc kubenswrapper[4735]: I1001 10:46:06.278289 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxxz7\" (UniqueName: \"kubernetes.io/projected/5b381c11-584e-4a17-b4a4-cd150f2d3d82-kube-api-access-pxxz7\") on node \"crc\" DevicePath \"\"" Oct 01 10:46:06 crc kubenswrapper[4735]: I1001 10:46:06.664631 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-svdrq" 
event={"ID":"5b381c11-584e-4a17-b4a4-cd150f2d3d82","Type":"ContainerDied","Data":"ebb3196a6fb32a009350934d62d6a5fdc2b6e92c22900a215d60ac2fc6322f93"} Oct 01 10:46:06 crc kubenswrapper[4735]: I1001 10:46:06.664685 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebb3196a6fb32a009350934d62d6a5fdc2b6e92c22900a215d60ac2fc6322f93" Oct 01 10:46:06 crc kubenswrapper[4735]: I1001 10:46:06.664713 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-svdrq" Oct 01 10:46:11 crc kubenswrapper[4735]: I1001 10:46:11.907139 4735 scope.go:117] "RemoveContainer" containerID="6bb0238a5016780708fe4e2202a4b58f82d3046437841d7c2fad3739fa55b3ba" Oct 01 10:46:11 crc kubenswrapper[4735]: E1001 10:46:11.914270 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:46:14 crc kubenswrapper[4735]: I1001 10:46:14.035690 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qlbrt"] Oct 01 10:46:14 crc kubenswrapper[4735]: E1001 10:46:14.036462 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b381c11-584e-4a17-b4a4-cd150f2d3d82" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 10:46:14 crc kubenswrapper[4735]: I1001 10:46:14.036481 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b381c11-584e-4a17-b4a4-cd150f2d3d82" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 10:46:14 crc kubenswrapper[4735]: I1001 10:46:14.036799 4735 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="5b381c11-584e-4a17-b4a4-cd150f2d3d82" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Oct 01 10:46:14 crc kubenswrapper[4735]: I1001 10:46:14.037600 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qlbrt"
Oct 01 10:46:14 crc kubenswrapper[4735]: I1001 10:46:14.041843 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 01 10:46:14 crc kubenswrapper[4735]: I1001 10:46:14.042109 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 01 10:46:14 crc kubenswrapper[4735]: I1001 10:46:14.042333 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ggpjk"
Oct 01 10:46:14 crc kubenswrapper[4735]: I1001 10:46:14.042576 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 01 10:46:14 crc kubenswrapper[4735]: I1001 10:46:14.051805 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qlbrt"]
Oct 01 10:46:14 crc kubenswrapper[4735]: I1001 10:46:14.121882 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qlbrt\" (UID: \"36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qlbrt"
Oct 01 10:46:14 crc kubenswrapper[4735]: I1001 10:46:14.121998 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74nnn\" (UniqueName: \"kubernetes.io/projected/36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c-kube-api-access-74nnn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qlbrt\" (UID: \"36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qlbrt"
Oct 01 10:46:14 crc kubenswrapper[4735]: I1001 10:46:14.122142 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qlbrt\" (UID: \"36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qlbrt"
Oct 01 10:46:14 crc kubenswrapper[4735]: I1001 10:46:14.224042 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qlbrt\" (UID: \"36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qlbrt"
Oct 01 10:46:14 crc kubenswrapper[4735]: I1001 10:46:14.224135 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74nnn\" (UniqueName: \"kubernetes.io/projected/36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c-kube-api-access-74nnn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qlbrt\" (UID: \"36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qlbrt"
Oct 01 10:46:14 crc kubenswrapper[4735]: I1001 10:46:14.224194 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qlbrt\" (UID: \"36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qlbrt"
Oct 01 10:46:14 crc kubenswrapper[4735]: I1001 10:46:14.230724 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qlbrt\" (UID: \"36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qlbrt"
Oct 01 10:46:14 crc kubenswrapper[4735]: I1001 10:46:14.231837 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qlbrt\" (UID: \"36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qlbrt"
Oct 01 10:46:14 crc kubenswrapper[4735]: I1001 10:46:14.245389 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74nnn\" (UniqueName: \"kubernetes.io/projected/36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c-kube-api-access-74nnn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qlbrt\" (UID: \"36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qlbrt"
Oct 01 10:46:14 crc kubenswrapper[4735]: I1001 10:46:14.366522 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qlbrt"
Oct 01 10:46:14 crc kubenswrapper[4735]: I1001 10:46:14.907848 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qlbrt"]
Oct 01 10:46:15 crc kubenswrapper[4735]: I1001 10:46:15.757934 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qlbrt" event={"ID":"36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c","Type":"ContainerStarted","Data":"cef2fed232dcc58f8668cac1c6757ca9b2352619ad058b9ed4d87e71b31ae712"}
Oct 01 10:46:15 crc kubenswrapper[4735]: I1001 10:46:15.759448 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qlbrt" event={"ID":"36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c","Type":"ContainerStarted","Data":"1475d6ee4776677395ad58b85f34bf1e9aeb2dc65ecd8bffcb6e54a2234e6ce1"}
Oct 01 10:46:15 crc kubenswrapper[4735]: I1001 10:46:15.789129 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qlbrt" podStartSLOduration=1.275216973 podStartE2EDuration="1.789105763s" podCreationTimestamp="2025-10-01 10:46:14 +0000 UTC" firstStartedPulling="2025-10-01 10:46:14.909777372 +0000 UTC m=+1733.602598644" lastFinishedPulling="2025-10-01 10:46:15.423666162 +0000 UTC m=+1734.116487434" observedRunningTime="2025-10-01 10:46:15.775865148 +0000 UTC m=+1734.468686410" watchObservedRunningTime="2025-10-01 10:46:15.789105763 +0000 UTC m=+1734.481927025"
Oct 01 10:46:22 crc kubenswrapper[4735]: I1001 10:46:22.896608 4735 scope.go:117] "RemoveContainer" containerID="6bb0238a5016780708fe4e2202a4b58f82d3046437841d7c2fad3739fa55b3ba"
Oct 01 10:46:22 crc kubenswrapper[4735]: E1001 10:46:22.897433 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1"
Oct 01 10:46:26 crc kubenswrapper[4735]: I1001 10:46:26.751916 4735 scope.go:117] "RemoveContainer" containerID="6d0079480629a7712148c34edf9c8b66c25b76d789d745048a3e3e522135452c"
Oct 01 10:46:26 crc kubenswrapper[4735]: I1001 10:46:26.790427 4735 scope.go:117] "RemoveContainer" containerID="cdbbe339009129f1757f6cc96124ca7315285304666ccec60b42445640c4dd14"
Oct 01 10:46:31 crc kubenswrapper[4735]: I1001 10:46:31.052135 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-gxllh"]
Oct 01 10:46:31 crc kubenswrapper[4735]: I1001 10:46:31.061736 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-gxllh"]
Oct 01 10:46:31 crc kubenswrapper[4735]: I1001 10:46:31.912049 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b35d0ee3-23af-4661-88bc-df962b75ced3" path="/var/lib/kubelet/pods/b35d0ee3-23af-4661-88bc-df962b75ced3/volumes"
Oct 01 10:46:36 crc kubenswrapper[4735]: I1001 10:46:36.897382 4735 scope.go:117] "RemoveContainer" containerID="6bb0238a5016780708fe4e2202a4b58f82d3046437841d7c2fad3739fa55b3ba"
Oct 01 10:46:36 crc kubenswrapper[4735]: E1001 10:46:36.898038 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1"
Oct 01 10:46:50 crc kubenswrapper[4735]: I1001 10:46:50.897484 4735 scope.go:117] "RemoveContainer" containerID="6bb0238a5016780708fe4e2202a4b58f82d3046437841d7c2fad3739fa55b3ba"
Oct 01 10:46:50 crc kubenswrapper[4735]: E1001 10:46:50.898276 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1"
Oct 01 10:47:01 crc kubenswrapper[4735]: I1001 10:47:01.206951 4735 generic.go:334] "Generic (PLEG): container finished" podID="36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c" containerID="cef2fed232dcc58f8668cac1c6757ca9b2352619ad058b9ed4d87e71b31ae712" exitCode=0
Oct 01 10:47:01 crc kubenswrapper[4735]: I1001 10:47:01.207038 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qlbrt" event={"ID":"36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c","Type":"ContainerDied","Data":"cef2fed232dcc58f8668cac1c6757ca9b2352619ad058b9ed4d87e71b31ae712"}
Oct 01 10:47:02 crc kubenswrapper[4735]: I1001 10:47:02.594770 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qlbrt"
Oct 01 10:47:02 crc kubenswrapper[4735]: I1001 10:47:02.748747 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74nnn\" (UniqueName: \"kubernetes.io/projected/36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c-kube-api-access-74nnn\") pod \"36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c\" (UID: \"36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c\") "
Oct 01 10:47:02 crc kubenswrapper[4735]: I1001 10:47:02.748830 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c-inventory\") pod \"36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c\" (UID: \"36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c\") "
Oct 01 10:47:02 crc kubenswrapper[4735]: I1001 10:47:02.748938 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c-ssh-key\") pod \"36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c\" (UID: \"36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c\") "
Oct 01 10:47:02 crc kubenswrapper[4735]: I1001 10:47:02.754801 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c-kube-api-access-74nnn" (OuterVolumeSpecName: "kube-api-access-74nnn") pod "36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c" (UID: "36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c"). InnerVolumeSpecName "kube-api-access-74nnn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 10:47:02 crc kubenswrapper[4735]: I1001 10:47:02.787475 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c" (UID: "36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:47:02 crc kubenswrapper[4735]: I1001 10:47:02.812724 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c-inventory" (OuterVolumeSpecName: "inventory") pod "36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c" (UID: "36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:47:02 crc kubenswrapper[4735]: I1001 10:47:02.850593 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74nnn\" (UniqueName: \"kubernetes.io/projected/36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c-kube-api-access-74nnn\") on node \"crc\" DevicePath \"\""
Oct 01 10:47:02 crc kubenswrapper[4735]: I1001 10:47:02.850620 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c-inventory\") on node \"crc\" DevicePath \"\""
Oct 01 10:47:02 crc kubenswrapper[4735]: I1001 10:47:02.850629 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 01 10:47:03 crc kubenswrapper[4735]: I1001 10:47:03.224552 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qlbrt" event={"ID":"36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c","Type":"ContainerDied","Data":"1475d6ee4776677395ad58b85f34bf1e9aeb2dc65ecd8bffcb6e54a2234e6ce1"}
Oct 01 10:47:03 crc kubenswrapper[4735]: I1001 10:47:03.224599 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1475d6ee4776677395ad58b85f34bf1e9aeb2dc65ecd8bffcb6e54a2234e6ce1"
Oct 01 10:47:03 crc kubenswrapper[4735]: I1001 10:47:03.224649 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qlbrt"
Oct 01 10:47:03 crc kubenswrapper[4735]: I1001 10:47:03.313855 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-dpr46"]
Oct 01 10:47:03 crc kubenswrapper[4735]: E1001 10:47:03.314307 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Oct 01 10:47:03 crc kubenswrapper[4735]: I1001 10:47:03.314327 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Oct 01 10:47:03 crc kubenswrapper[4735]: I1001 10:47:03.314543 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Oct 01 10:47:03 crc kubenswrapper[4735]: I1001 10:47:03.315251 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dpr46"
Oct 01 10:47:03 crc kubenswrapper[4735]: I1001 10:47:03.317441 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 01 10:47:03 crc kubenswrapper[4735]: I1001 10:47:03.318436 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 01 10:47:03 crc kubenswrapper[4735]: I1001 10:47:03.321853 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ggpjk"
Oct 01 10:47:03 crc kubenswrapper[4735]: I1001 10:47:03.324860 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 01 10:47:03 crc kubenswrapper[4735]: I1001 10:47:03.327204 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-dpr46"]
Oct 01 10:47:03 crc kubenswrapper[4735]: I1001 10:47:03.460348 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hndjz\" (UniqueName: \"kubernetes.io/projected/57690d02-83f0-47e6-a66f-da0ab4138820-kube-api-access-hndjz\") pod \"ssh-known-hosts-edpm-deployment-dpr46\" (UID: \"57690d02-83f0-47e6-a66f-da0ab4138820\") " pod="openstack/ssh-known-hosts-edpm-deployment-dpr46"
Oct 01 10:47:03 crc kubenswrapper[4735]: I1001 10:47:03.460690 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57690d02-83f0-47e6-a66f-da0ab4138820-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-dpr46\" (UID: \"57690d02-83f0-47e6-a66f-da0ab4138820\") " pod="openstack/ssh-known-hosts-edpm-deployment-dpr46"
Oct 01 10:47:03 crc kubenswrapper[4735]: I1001 10:47:03.460883 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/57690d02-83f0-47e6-a66f-da0ab4138820-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-dpr46\" (UID: \"57690d02-83f0-47e6-a66f-da0ab4138820\") " pod="openstack/ssh-known-hosts-edpm-deployment-dpr46"
Oct 01 10:47:03 crc kubenswrapper[4735]: I1001 10:47:03.562938 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/57690d02-83f0-47e6-a66f-da0ab4138820-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-dpr46\" (UID: \"57690d02-83f0-47e6-a66f-da0ab4138820\") " pod="openstack/ssh-known-hosts-edpm-deployment-dpr46"
Oct 01 10:47:03 crc kubenswrapper[4735]: I1001 10:47:03.563004 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hndjz\" (UniqueName: \"kubernetes.io/projected/57690d02-83f0-47e6-a66f-da0ab4138820-kube-api-access-hndjz\") pod \"ssh-known-hosts-edpm-deployment-dpr46\" (UID: \"57690d02-83f0-47e6-a66f-da0ab4138820\") " pod="openstack/ssh-known-hosts-edpm-deployment-dpr46"
Oct 01 10:47:03 crc kubenswrapper[4735]: I1001 10:47:03.563053 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57690d02-83f0-47e6-a66f-da0ab4138820-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-dpr46\" (UID: \"57690d02-83f0-47e6-a66f-da0ab4138820\") " pod="openstack/ssh-known-hosts-edpm-deployment-dpr46"
Oct 01 10:47:03 crc kubenswrapper[4735]: I1001 10:47:03.567114 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/57690d02-83f0-47e6-a66f-da0ab4138820-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-dpr46\" (UID: \"57690d02-83f0-47e6-a66f-da0ab4138820\") " pod="openstack/ssh-known-hosts-edpm-deployment-dpr46"
Oct 01 10:47:03 crc kubenswrapper[4735]: I1001 10:47:03.567461 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57690d02-83f0-47e6-a66f-da0ab4138820-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-dpr46\" (UID: \"57690d02-83f0-47e6-a66f-da0ab4138820\") " pod="openstack/ssh-known-hosts-edpm-deployment-dpr46"
Oct 01 10:47:03 crc kubenswrapper[4735]: I1001 10:47:03.579841 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hndjz\" (UniqueName: \"kubernetes.io/projected/57690d02-83f0-47e6-a66f-da0ab4138820-kube-api-access-hndjz\") pod \"ssh-known-hosts-edpm-deployment-dpr46\" (UID: \"57690d02-83f0-47e6-a66f-da0ab4138820\") " pod="openstack/ssh-known-hosts-edpm-deployment-dpr46"
Oct 01 10:47:03 crc kubenswrapper[4735]: I1001 10:47:03.630937 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dpr46"
Oct 01 10:47:04 crc kubenswrapper[4735]: I1001 10:47:04.121368 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-dpr46"]
Oct 01 10:47:04 crc kubenswrapper[4735]: I1001 10:47:04.232005 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dpr46" event={"ID":"57690d02-83f0-47e6-a66f-da0ab4138820","Type":"ContainerStarted","Data":"de7f48e1686d4a7ce85b76dc46a38ea922191b9f4ff4b06c0ff238e2f63db142"}
Oct 01 10:47:04 crc kubenswrapper[4735]: I1001 10:47:04.897774 4735 scope.go:117] "RemoveContainer" containerID="6bb0238a5016780708fe4e2202a4b58f82d3046437841d7c2fad3739fa55b3ba"
Oct 01 10:47:04 crc kubenswrapper[4735]: E1001 10:47:04.898431 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1"
Oct 01 10:47:05 crc kubenswrapper[4735]: I1001 10:47:05.241826 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dpr46" event={"ID":"57690d02-83f0-47e6-a66f-da0ab4138820","Type":"ContainerStarted","Data":"441bfcbbb8e8c010fb9d0f26d6253176ecb82d7d09ff7cbe39bdb541dd506105"}
Oct 01 10:47:05 crc kubenswrapper[4735]: I1001 10:47:05.267750 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-dpr46" podStartSLOduration=1.589616355 podStartE2EDuration="2.26773017s" podCreationTimestamp="2025-10-01 10:47:03 +0000 UTC" firstStartedPulling="2025-10-01 10:47:04.127564518 +0000 UTC m=+1782.820385780" lastFinishedPulling="2025-10-01 10:47:04.805678333 +0000 UTC m=+1783.498499595" observedRunningTime="2025-10-01 10:47:05.256419386 +0000 UTC m=+1783.949240668" watchObservedRunningTime="2025-10-01 10:47:05.26773017 +0000 UTC m=+1783.960551442"
Oct 01 10:47:12 crc kubenswrapper[4735]: I1001 10:47:12.308799 4735 generic.go:334] "Generic (PLEG): container finished" podID="57690d02-83f0-47e6-a66f-da0ab4138820" containerID="441bfcbbb8e8c010fb9d0f26d6253176ecb82d7d09ff7cbe39bdb541dd506105" exitCode=0
Oct 01 10:47:12 crc kubenswrapper[4735]: I1001 10:47:12.308870 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dpr46" event={"ID":"57690d02-83f0-47e6-a66f-da0ab4138820","Type":"ContainerDied","Data":"441bfcbbb8e8c010fb9d0f26d6253176ecb82d7d09ff7cbe39bdb541dd506105"}
Oct 01 10:47:13 crc kubenswrapper[4735]: I1001 10:47:13.729018 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dpr46"
Oct 01 10:47:13 crc kubenswrapper[4735]: I1001 10:47:13.866802 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/57690d02-83f0-47e6-a66f-da0ab4138820-inventory-0\") pod \"57690d02-83f0-47e6-a66f-da0ab4138820\" (UID: \"57690d02-83f0-47e6-a66f-da0ab4138820\") "
Oct 01 10:47:13 crc kubenswrapper[4735]: I1001 10:47:13.866947 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hndjz\" (UniqueName: \"kubernetes.io/projected/57690d02-83f0-47e6-a66f-da0ab4138820-kube-api-access-hndjz\") pod \"57690d02-83f0-47e6-a66f-da0ab4138820\" (UID: \"57690d02-83f0-47e6-a66f-da0ab4138820\") "
Oct 01 10:47:13 crc kubenswrapper[4735]: I1001 10:47:13.867054 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57690d02-83f0-47e6-a66f-da0ab4138820-ssh-key-openstack-edpm-ipam\") pod \"57690d02-83f0-47e6-a66f-da0ab4138820\" (UID: \"57690d02-83f0-47e6-a66f-da0ab4138820\") "
Oct 01 10:47:13 crc kubenswrapper[4735]: I1001 10:47:13.875114 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57690d02-83f0-47e6-a66f-da0ab4138820-kube-api-access-hndjz" (OuterVolumeSpecName: "kube-api-access-hndjz") pod "57690d02-83f0-47e6-a66f-da0ab4138820" (UID: "57690d02-83f0-47e6-a66f-da0ab4138820"). InnerVolumeSpecName "kube-api-access-hndjz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 10:47:13 crc kubenswrapper[4735]: I1001 10:47:13.913561 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57690d02-83f0-47e6-a66f-da0ab4138820-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "57690d02-83f0-47e6-a66f-da0ab4138820" (UID: "57690d02-83f0-47e6-a66f-da0ab4138820"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:47:13 crc kubenswrapper[4735]: I1001 10:47:13.913874 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57690d02-83f0-47e6-a66f-da0ab4138820-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "57690d02-83f0-47e6-a66f-da0ab4138820" (UID: "57690d02-83f0-47e6-a66f-da0ab4138820"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:47:13 crc kubenswrapper[4735]: I1001 10:47:13.970639 4735 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/57690d02-83f0-47e6-a66f-da0ab4138820-inventory-0\") on node \"crc\" DevicePath \"\""
Oct 01 10:47:13 crc kubenswrapper[4735]: I1001 10:47:13.970693 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hndjz\" (UniqueName: \"kubernetes.io/projected/57690d02-83f0-47e6-a66f-da0ab4138820-kube-api-access-hndjz\") on node \"crc\" DevicePath \"\""
Oct 01 10:47:13 crc kubenswrapper[4735]: I1001 10:47:13.970718 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57690d02-83f0-47e6-a66f-da0ab4138820-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Oct 01 10:47:14 crc kubenswrapper[4735]: I1001 10:47:14.327455 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dpr46" event={"ID":"57690d02-83f0-47e6-a66f-da0ab4138820","Type":"ContainerDied","Data":"de7f48e1686d4a7ce85b76dc46a38ea922191b9f4ff4b06c0ff238e2f63db142"}
Oct 01 10:47:14 crc kubenswrapper[4735]: I1001 10:47:14.327556 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de7f48e1686d4a7ce85b76dc46a38ea922191b9f4ff4b06c0ff238e2f63db142"
Oct 01 10:47:14 crc kubenswrapper[4735]: I1001 10:47:14.327583 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dpr46"
Oct 01 10:47:14 crc kubenswrapper[4735]: I1001 10:47:14.418404 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-c8mbg"]
Oct 01 10:47:14 crc kubenswrapper[4735]: E1001 10:47:14.418981 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57690d02-83f0-47e6-a66f-da0ab4138820" containerName="ssh-known-hosts-edpm-deployment"
Oct 01 10:47:14 crc kubenswrapper[4735]: I1001 10:47:14.419005 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="57690d02-83f0-47e6-a66f-da0ab4138820" containerName="ssh-known-hosts-edpm-deployment"
Oct 01 10:47:14 crc kubenswrapper[4735]: I1001 10:47:14.419303 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="57690d02-83f0-47e6-a66f-da0ab4138820" containerName="ssh-known-hosts-edpm-deployment"
Oct 01 10:47:14 crc kubenswrapper[4735]: I1001 10:47:14.420092 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c8mbg"
Oct 01 10:47:14 crc kubenswrapper[4735]: I1001 10:47:14.422274 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 01 10:47:14 crc kubenswrapper[4735]: I1001 10:47:14.423131 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 01 10:47:14 crc kubenswrapper[4735]: I1001 10:47:14.423723 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ggpjk"
Oct 01 10:47:14 crc kubenswrapper[4735]: I1001 10:47:14.425738 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 01 10:47:14 crc kubenswrapper[4735]: I1001 10:47:14.428860 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-c8mbg"]
Oct 01 10:47:14 crc kubenswrapper[4735]: I1001 10:47:14.579301 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7223ea14-1a97-4c77-bbab-5f2919606539-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c8mbg\" (UID: \"7223ea14-1a97-4c77-bbab-5f2919606539\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c8mbg"
Oct 01 10:47:14 crc kubenswrapper[4735]: I1001 10:47:14.579614 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cssl\" (UniqueName: \"kubernetes.io/projected/7223ea14-1a97-4c77-bbab-5f2919606539-kube-api-access-9cssl\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c8mbg\" (UID: \"7223ea14-1a97-4c77-bbab-5f2919606539\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c8mbg"
Oct 01 10:47:14 crc kubenswrapper[4735]: I1001 10:47:14.579644 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7223ea14-1a97-4c77-bbab-5f2919606539-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c8mbg\" (UID: \"7223ea14-1a97-4c77-bbab-5f2919606539\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c8mbg"
Oct 01 10:47:14 crc kubenswrapper[4735]: I1001 10:47:14.681491 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7223ea14-1a97-4c77-bbab-5f2919606539-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c8mbg\" (UID: \"7223ea14-1a97-4c77-bbab-5f2919606539\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c8mbg"
Oct 01 10:47:14 crc kubenswrapper[4735]: I1001 10:47:14.681669 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cssl\" (UniqueName: \"kubernetes.io/projected/7223ea14-1a97-4c77-bbab-5f2919606539-kube-api-access-9cssl\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c8mbg\" (UID: \"7223ea14-1a97-4c77-bbab-5f2919606539\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c8mbg"
Oct 01 10:47:14 crc kubenswrapper[4735]: I1001 10:47:14.681700 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7223ea14-1a97-4c77-bbab-5f2919606539-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c8mbg\" (UID: \"7223ea14-1a97-4c77-bbab-5f2919606539\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c8mbg"
Oct 01 10:47:14 crc kubenswrapper[4735]: I1001 10:47:14.686141 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7223ea14-1a97-4c77-bbab-5f2919606539-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c8mbg\" (UID: \"7223ea14-1a97-4c77-bbab-5f2919606539\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c8mbg"
Oct 01 10:47:14 crc kubenswrapper[4735]: I1001 10:47:14.686800 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7223ea14-1a97-4c77-bbab-5f2919606539-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c8mbg\" (UID: \"7223ea14-1a97-4c77-bbab-5f2919606539\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c8mbg"
Oct 01 10:47:14 crc kubenswrapper[4735]: I1001 10:47:14.699654 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cssl\" (UniqueName: \"kubernetes.io/projected/7223ea14-1a97-4c77-bbab-5f2919606539-kube-api-access-9cssl\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-c8mbg\" (UID: \"7223ea14-1a97-4c77-bbab-5f2919606539\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c8mbg"
Oct 01 10:47:14 crc kubenswrapper[4735]: I1001 10:47:14.747467 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c8mbg"
Oct 01 10:47:15 crc kubenswrapper[4735]: I1001 10:47:15.261635 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-c8mbg"]
Oct 01 10:47:15 crc kubenswrapper[4735]: I1001 10:47:15.336392 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c8mbg" event={"ID":"7223ea14-1a97-4c77-bbab-5f2919606539","Type":"ContainerStarted","Data":"dffd188d6efd72588b686d3a398f661b1c8a62a1bb89f46517d12e7770ccd09d"}
Oct 01 10:47:17 crc kubenswrapper[4735]: I1001 10:47:17.360800 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c8mbg" event={"ID":"7223ea14-1a97-4c77-bbab-5f2919606539","Type":"ContainerStarted","Data":"575375bb85f45cf311a173659146fdb0b716bde5856246ea2bb71eb5d071d25f"}
Oct 01 10:47:17 crc kubenswrapper[4735]: I1001 10:47:17.403588 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c8mbg" podStartSLOduration=2.2465517090000002 podStartE2EDuration="3.403564843s" podCreationTimestamp="2025-10-01 10:47:14 +0000 UTC" firstStartedPulling="2025-10-01 10:47:15.270466628 +0000 UTC m=+1793.963287890" lastFinishedPulling="2025-10-01 10:47:16.427479762 +0000 UTC m=+1795.120301024" observedRunningTime="2025-10-01 10:47:17.382113567 +0000 UTC m=+1796.074934839" watchObservedRunningTime="2025-10-01 10:47:17.403564843 +0000 UTC m=+1796.096386115"
Oct 01 10:47:17 crc kubenswrapper[4735]: I1001 10:47:17.897799 4735 scope.go:117] "RemoveContainer" containerID="6bb0238a5016780708fe4e2202a4b58f82d3046437841d7c2fad3739fa55b3ba"
Oct 01 10:47:17 crc kubenswrapper[4735]: E1001 10:47:17.898789 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1"
Oct 01 10:47:25 crc kubenswrapper[4735]: I1001 10:47:25.443522 4735 generic.go:334] "Generic (PLEG): container finished" podID="7223ea14-1a97-4c77-bbab-5f2919606539" containerID="575375bb85f45cf311a173659146fdb0b716bde5856246ea2bb71eb5d071d25f" exitCode=0
Oct 01 10:47:25 crc kubenswrapper[4735]: I1001 10:47:25.443528 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c8mbg" event={"ID":"7223ea14-1a97-4c77-bbab-5f2919606539","Type":"ContainerDied","Data":"575375bb85f45cf311a173659146fdb0b716bde5856246ea2bb71eb5d071d25f"}
Oct 01 10:47:26 crc kubenswrapper[4735]: I1001 10:47:26.835407 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c8mbg"
Oct 01 10:47:26 crc kubenswrapper[4735]: I1001 10:47:26.874929 4735 scope.go:117] "RemoveContainer" containerID="f5c95e6647646ffa4b3fa47094851620a1358671915e5a25be4b2f27ad7cc190"
Oct 01 10:47:26 crc kubenswrapper[4735]: I1001 10:47:26.917155 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7223ea14-1a97-4c77-bbab-5f2919606539-ssh-key\") pod \"7223ea14-1a97-4c77-bbab-5f2919606539\" (UID: \"7223ea14-1a97-4c77-bbab-5f2919606539\") "
Oct 01 10:47:26 crc kubenswrapper[4735]: I1001 10:47:26.917298 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7223ea14-1a97-4c77-bbab-5f2919606539-inventory\") pod \"7223ea14-1a97-4c77-bbab-5f2919606539\" (UID: \"7223ea14-1a97-4c77-bbab-5f2919606539\") "
Oct 01 10:47:26 crc kubenswrapper[4735]: I1001 10:47:26.917447 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cssl\" (UniqueName: \"kubernetes.io/projected/7223ea14-1a97-4c77-bbab-5f2919606539-kube-api-access-9cssl\") pod \"7223ea14-1a97-4c77-bbab-5f2919606539\" (UID: \"7223ea14-1a97-4c77-bbab-5f2919606539\") "
Oct 01 10:47:26 crc kubenswrapper[4735]: I1001 10:47:26.924076 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7223ea14-1a97-4c77-bbab-5f2919606539-kube-api-access-9cssl" (OuterVolumeSpecName: "kube-api-access-9cssl") pod "7223ea14-1a97-4c77-bbab-5f2919606539" (UID: "7223ea14-1a97-4c77-bbab-5f2919606539"). InnerVolumeSpecName "kube-api-access-9cssl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 10:47:26 crc kubenswrapper[4735]: I1001 10:47:26.950011 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7223ea14-1a97-4c77-bbab-5f2919606539-inventory" (OuterVolumeSpecName: "inventory") pod "7223ea14-1a97-4c77-bbab-5f2919606539" (UID: "7223ea14-1a97-4c77-bbab-5f2919606539"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:47:26 crc kubenswrapper[4735]: I1001 10:47:26.951336 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7223ea14-1a97-4c77-bbab-5f2919606539-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7223ea14-1a97-4c77-bbab-5f2919606539" (UID: "7223ea14-1a97-4c77-bbab-5f2919606539"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:47:27 crc kubenswrapper[4735]: I1001 10:47:27.020094 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cssl\" (UniqueName: \"kubernetes.io/projected/7223ea14-1a97-4c77-bbab-5f2919606539-kube-api-access-9cssl\") on node \"crc\" DevicePath \"\""
Oct 01 10:47:27 crc kubenswrapper[4735]: I1001 10:47:27.020135 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7223ea14-1a97-4c77-bbab-5f2919606539-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 01 10:47:27 crc kubenswrapper[4735]: I1001 10:47:27.020148 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7223ea14-1a97-4c77-bbab-5f2919606539-inventory\") on node \"crc\" DevicePath \"\""
Oct 01 10:47:27 crc kubenswrapper[4735]: I1001 10:47:27.464218 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c8mbg" event={"ID":"7223ea14-1a97-4c77-bbab-5f2919606539","Type":"ContainerDied","Data":"dffd188d6efd72588b686d3a398f661b1c8a62a1bb89f46517d12e7770ccd09d"}
Oct 01 10:47:27 crc kubenswrapper[4735]: I1001 10:47:27.464289 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dffd188d6efd72588b686d3a398f661b1c8a62a1bb89f46517d12e7770ccd09d"
Oct 01 10:47:27 crc kubenswrapper[4735]: I1001 10:47:27.464235 4735 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-c8mbg" Oct 01 10:47:27 crc kubenswrapper[4735]: I1001 10:47:27.530898 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-chlk8"] Oct 01 10:47:27 crc kubenswrapper[4735]: E1001 10:47:27.531810 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7223ea14-1a97-4c77-bbab-5f2919606539" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 01 10:47:27 crc kubenswrapper[4735]: I1001 10:47:27.531837 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7223ea14-1a97-4c77-bbab-5f2919606539" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 01 10:47:27 crc kubenswrapper[4735]: I1001 10:47:27.532084 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7223ea14-1a97-4c77-bbab-5f2919606539" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 01 10:47:27 crc kubenswrapper[4735]: I1001 10:47:27.532903 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-chlk8" Oct 01 10:47:27 crc kubenswrapper[4735]: I1001 10:47:27.542229 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ggpjk" Oct 01 10:47:27 crc kubenswrapper[4735]: I1001 10:47:27.542701 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 10:47:27 crc kubenswrapper[4735]: I1001 10:47:27.542789 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 10:47:27 crc kubenswrapper[4735]: I1001 10:47:27.543113 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 10:47:27 crc kubenswrapper[4735]: I1001 10:47:27.563096 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-chlk8"] Oct 01 10:47:27 crc kubenswrapper[4735]: I1001 10:47:27.630553 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szsb5\" (UniqueName: \"kubernetes.io/projected/fa85eebe-cbdf-41f6-b47b-5e844222f3fe-kube-api-access-szsb5\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-chlk8\" (UID: \"fa85eebe-cbdf-41f6-b47b-5e844222f3fe\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-chlk8" Oct 01 10:47:27 crc kubenswrapper[4735]: I1001 10:47:27.630715 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa85eebe-cbdf-41f6-b47b-5e844222f3fe-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-chlk8\" (UID: \"fa85eebe-cbdf-41f6-b47b-5e844222f3fe\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-chlk8" Oct 01 10:47:27 crc kubenswrapper[4735]: I1001 10:47:27.630847 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fa85eebe-cbdf-41f6-b47b-5e844222f3fe-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-chlk8\" (UID: \"fa85eebe-cbdf-41f6-b47b-5e844222f3fe\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-chlk8" Oct 01 10:47:27 crc kubenswrapper[4735]: I1001 10:47:27.732085 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szsb5\" (UniqueName: \"kubernetes.io/projected/fa85eebe-cbdf-41f6-b47b-5e844222f3fe-kube-api-access-szsb5\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-chlk8\" (UID: \"fa85eebe-cbdf-41f6-b47b-5e844222f3fe\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-chlk8" Oct 01 10:47:27 crc kubenswrapper[4735]: I1001 10:47:27.732195 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa85eebe-cbdf-41f6-b47b-5e844222f3fe-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-chlk8\" (UID: \"fa85eebe-cbdf-41f6-b47b-5e844222f3fe\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-chlk8" Oct 01 10:47:27 crc kubenswrapper[4735]: I1001 10:47:27.732288 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fa85eebe-cbdf-41f6-b47b-5e844222f3fe-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-chlk8\" (UID: \"fa85eebe-cbdf-41f6-b47b-5e844222f3fe\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-chlk8" Oct 01 10:47:27 crc kubenswrapper[4735]: I1001 10:47:27.738028 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fa85eebe-cbdf-41f6-b47b-5e844222f3fe-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-chlk8\" (UID: \"fa85eebe-cbdf-41f6-b47b-5e844222f3fe\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-chlk8" Oct 01 10:47:27 crc kubenswrapper[4735]: I1001 10:47:27.738281 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa85eebe-cbdf-41f6-b47b-5e844222f3fe-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-chlk8\" (UID: \"fa85eebe-cbdf-41f6-b47b-5e844222f3fe\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-chlk8" Oct 01 10:47:27 crc kubenswrapper[4735]: I1001 10:47:27.748071 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szsb5\" (UniqueName: \"kubernetes.io/projected/fa85eebe-cbdf-41f6-b47b-5e844222f3fe-kube-api-access-szsb5\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-chlk8\" (UID: \"fa85eebe-cbdf-41f6-b47b-5e844222f3fe\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-chlk8" Oct 01 10:47:27 crc kubenswrapper[4735]: I1001 10:47:27.851115 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-chlk8" Oct 01 10:47:28 crc kubenswrapper[4735]: I1001 10:47:28.401627 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-chlk8"] Oct 01 10:47:28 crc kubenswrapper[4735]: W1001 10:47:28.406568 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa85eebe_cbdf_41f6_b47b_5e844222f3fe.slice/crio-359b80c0f45f3eb5d4a1586dcd3f6f98b1dd4cc29cc4692e0469c4b0855a78bf WatchSource:0}: Error finding container 359b80c0f45f3eb5d4a1586dcd3f6f98b1dd4cc29cc4692e0469c4b0855a78bf: Status 404 returned error can't find the container with id 359b80c0f45f3eb5d4a1586dcd3f6f98b1dd4cc29cc4692e0469c4b0855a78bf Oct 01 10:47:28 crc kubenswrapper[4735]: I1001 10:47:28.472117 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-chlk8" event={"ID":"fa85eebe-cbdf-41f6-b47b-5e844222f3fe","Type":"ContainerStarted","Data":"359b80c0f45f3eb5d4a1586dcd3f6f98b1dd4cc29cc4692e0469c4b0855a78bf"} Oct 01 10:47:29 crc kubenswrapper[4735]: I1001 10:47:29.897691 4735 scope.go:117] "RemoveContainer" containerID="6bb0238a5016780708fe4e2202a4b58f82d3046437841d7c2fad3739fa55b3ba" Oct 01 10:47:29 crc kubenswrapper[4735]: E1001 10:47:29.898229 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:47:33 crc kubenswrapper[4735]: I1001 10:47:33.540982 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-chlk8" event={"ID":"fa85eebe-cbdf-41f6-b47b-5e844222f3fe","Type":"ContainerStarted","Data":"f56842c4f3d00e01a4ce16bd65b84bba4dc746f8fb124671b21474d36dbae28b"} Oct 01 10:47:33 crc kubenswrapper[4735]: I1001 10:47:33.563953 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-chlk8" podStartSLOduration=1.762199793 podStartE2EDuration="6.563931166s" podCreationTimestamp="2025-10-01 10:47:27 +0000 UTC" firstStartedPulling="2025-10-01 10:47:28.410536072 +0000 UTC m=+1807.103357344" lastFinishedPulling="2025-10-01 10:47:33.212267455 +0000 UTC m=+1811.905088717" observedRunningTime="2025-10-01 10:47:33.560663589 +0000 UTC m=+1812.253484861" watchObservedRunningTime="2025-10-01 10:47:33.563931166 +0000 UTC m=+1812.256752438" Oct 01 10:47:42 crc kubenswrapper[4735]: I1001 10:47:42.897142 4735 scope.go:117] "RemoveContainer" containerID="6bb0238a5016780708fe4e2202a4b58f82d3046437841d7c2fad3739fa55b3ba" Oct 01 10:47:42 crc kubenswrapper[4735]: E1001 10:47:42.898088 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:47:43 crc kubenswrapper[4735]: I1001 10:47:43.641174 4735 generic.go:334] "Generic (PLEG): container finished" podID="fa85eebe-cbdf-41f6-b47b-5e844222f3fe" containerID="f56842c4f3d00e01a4ce16bd65b84bba4dc746f8fb124671b21474d36dbae28b" exitCode=0 Oct 01 10:47:43 crc kubenswrapper[4735]: I1001 10:47:43.641264 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-chlk8" 
event={"ID":"fa85eebe-cbdf-41f6-b47b-5e844222f3fe","Type":"ContainerDied","Data":"f56842c4f3d00e01a4ce16bd65b84bba4dc746f8fb124671b21474d36dbae28b"} Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.116730 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-chlk8" Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.204951 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szsb5\" (UniqueName: \"kubernetes.io/projected/fa85eebe-cbdf-41f6-b47b-5e844222f3fe-kube-api-access-szsb5\") pod \"fa85eebe-cbdf-41f6-b47b-5e844222f3fe\" (UID: \"fa85eebe-cbdf-41f6-b47b-5e844222f3fe\") " Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.205185 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa85eebe-cbdf-41f6-b47b-5e844222f3fe-inventory\") pod \"fa85eebe-cbdf-41f6-b47b-5e844222f3fe\" (UID: \"fa85eebe-cbdf-41f6-b47b-5e844222f3fe\") " Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.205284 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fa85eebe-cbdf-41f6-b47b-5e844222f3fe-ssh-key\") pod \"fa85eebe-cbdf-41f6-b47b-5e844222f3fe\" (UID: \"fa85eebe-cbdf-41f6-b47b-5e844222f3fe\") " Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.215058 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa85eebe-cbdf-41f6-b47b-5e844222f3fe-kube-api-access-szsb5" (OuterVolumeSpecName: "kube-api-access-szsb5") pod "fa85eebe-cbdf-41f6-b47b-5e844222f3fe" (UID: "fa85eebe-cbdf-41f6-b47b-5e844222f3fe"). InnerVolumeSpecName "kube-api-access-szsb5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.244704 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa85eebe-cbdf-41f6-b47b-5e844222f3fe-inventory" (OuterVolumeSpecName: "inventory") pod "fa85eebe-cbdf-41f6-b47b-5e844222f3fe" (UID: "fa85eebe-cbdf-41f6-b47b-5e844222f3fe"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.254705 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa85eebe-cbdf-41f6-b47b-5e844222f3fe-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fa85eebe-cbdf-41f6-b47b-5e844222f3fe" (UID: "fa85eebe-cbdf-41f6-b47b-5e844222f3fe"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.307631 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa85eebe-cbdf-41f6-b47b-5e844222f3fe-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.307683 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fa85eebe-cbdf-41f6-b47b-5e844222f3fe-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.307703 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szsb5\" (UniqueName: \"kubernetes.io/projected/fa85eebe-cbdf-41f6-b47b-5e844222f3fe-kube-api-access-szsb5\") on node \"crc\" DevicePath \"\"" Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.661302 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-chlk8" 
event={"ID":"fa85eebe-cbdf-41f6-b47b-5e844222f3fe","Type":"ContainerDied","Data":"359b80c0f45f3eb5d4a1586dcd3f6f98b1dd4cc29cc4692e0469c4b0855a78bf"} Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.661354 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="359b80c0f45f3eb5d4a1586dcd3f6f98b1dd4cc29cc4692e0469c4b0855a78bf" Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.661366 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-chlk8" Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.771275 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z"] Oct 01 10:47:45 crc kubenswrapper[4735]: E1001 10:47:45.771645 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa85eebe-cbdf-41f6-b47b-5e844222f3fe" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.771663 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa85eebe-cbdf-41f6-b47b-5e844222f3fe" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.771863 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa85eebe-cbdf-41f6-b47b-5e844222f3fe" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.772449 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.776034 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ggpjk" Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.776116 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.776701 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.776935 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.777435 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.777839 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.778888 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.779997 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.800314 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z"] Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.922060 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.922110 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.922155 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.922197 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77d6c2ca-cb77-4582-b644-d077086a29b5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.922233 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/77d6c2ca-cb77-4582-b644-d077086a29b5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.922279 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.922322 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77d6c2ca-cb77-4582-b644-d077086a29b5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.922343 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77d6c2ca-cb77-4582-b644-d077086a29b5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.922373 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.922411 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.922448 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.922467 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.922533 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:45 crc kubenswrapper[4735]: I1001 10:47:45.922589 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw47p\" (UniqueName: \"kubernetes.io/projected/77d6c2ca-cb77-4582-b644-d077086a29b5-kube-api-access-fw47p\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:46 crc kubenswrapper[4735]: I1001 10:47:46.024362 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77d6c2ca-cb77-4582-b644-d077086a29b5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:46 crc kubenswrapper[4735]: I1001 10:47:46.024418 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:46 crc kubenswrapper[4735]: I1001 10:47:46.024459 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" 
(UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:46 crc kubenswrapper[4735]: I1001 10:47:46.024542 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:46 crc kubenswrapper[4735]: I1001 10:47:46.024569 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:46 crc kubenswrapper[4735]: I1001 10:47:46.024612 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:46 crc kubenswrapper[4735]: I1001 10:47:46.024690 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw47p\" (UniqueName: \"kubernetes.io/projected/77d6c2ca-cb77-4582-b644-d077086a29b5-kube-api-access-fw47p\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" 
Oct 01 10:47:46 crc kubenswrapper[4735]: I1001 10:47:46.024719 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:46 crc kubenswrapper[4735]: I1001 10:47:46.024766 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:46 crc kubenswrapper[4735]: I1001 10:47:46.025288 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:46 crc kubenswrapper[4735]: I1001 10:47:46.025690 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77d6c2ca-cb77-4582-b644-d077086a29b5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:46 crc kubenswrapper[4735]: I1001 10:47:46.025748 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77d6c2ca-cb77-4582-b644-d077086a29b5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:46 crc kubenswrapper[4735]: I1001 10:47:46.025990 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:46 crc kubenswrapper[4735]: I1001 10:47:46.026068 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77d6c2ca-cb77-4582-b644-d077086a29b5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:46 crc kubenswrapper[4735]: I1001 10:47:46.029369 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:46 crc kubenswrapper[4735]: I1001 10:47:46.030253 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:46 crc kubenswrapper[4735]: I1001 10:47:46.030341 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:46 crc kubenswrapper[4735]: I1001 10:47:46.030480 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:46 crc kubenswrapper[4735]: I1001 10:47:46.032366 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:46 crc kubenswrapper[4735]: I1001 10:47:46.033446 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77d6c2ca-cb77-4582-b644-d077086a29b5-openstack-edpm-ipam-libvirt-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:46 crc kubenswrapper[4735]: I1001 10:47:46.033489 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:46 crc kubenswrapper[4735]: I1001 10:47:46.034774 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77d6c2ca-cb77-4582-b644-d077086a29b5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:46 crc kubenswrapper[4735]: I1001 10:47:46.035816 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:46 crc kubenswrapper[4735]: I1001 10:47:46.036361 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77d6c2ca-cb77-4582-b644-d077086a29b5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:46 crc kubenswrapper[4735]: I1001 10:47:46.036434 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77d6c2ca-cb77-4582-b644-d077086a29b5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:46 crc kubenswrapper[4735]: I1001 10:47:46.037261 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:46 crc kubenswrapper[4735]: I1001 10:47:46.038051 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:46 crc kubenswrapper[4735]: I1001 10:47:46.046804 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw47p\" (UniqueName: \"kubernetes.io/projected/77d6c2ca-cb77-4582-b644-d077086a29b5-kube-api-access-fw47p\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:46 crc kubenswrapper[4735]: I1001 
10:47:46.092315 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:47:46 crc kubenswrapper[4735]: I1001 10:47:46.691851 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z"] Oct 01 10:47:46 crc kubenswrapper[4735]: W1001 10:47:46.700255 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77d6c2ca_cb77_4582_b644_d077086a29b5.slice/crio-60f4922372f455438fa2d60c92f96df2f85b8695c179925afe32333494e72731 WatchSource:0}: Error finding container 60f4922372f455438fa2d60c92f96df2f85b8695c179925afe32333494e72731: Status 404 returned error can't find the container with id 60f4922372f455438fa2d60c92f96df2f85b8695c179925afe32333494e72731 Oct 01 10:47:47 crc kubenswrapper[4735]: I1001 10:47:47.681314 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" event={"ID":"77d6c2ca-cb77-4582-b644-d077086a29b5","Type":"ContainerStarted","Data":"8f7417c90c744adccf7683b9d9df0f09e8a8e7cb14fe1ebfb1b7fdb501e20e5b"} Oct 01 10:47:47 crc kubenswrapper[4735]: I1001 10:47:47.681666 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" event={"ID":"77d6c2ca-cb77-4582-b644-d077086a29b5","Type":"ContainerStarted","Data":"60f4922372f455438fa2d60c92f96df2f85b8695c179925afe32333494e72731"} Oct 01 10:47:57 crc kubenswrapper[4735]: I1001 10:47:57.897463 4735 scope.go:117] "RemoveContainer" containerID="6bb0238a5016780708fe4e2202a4b58f82d3046437841d7c2fad3739fa55b3ba" Oct 01 10:47:57 crc kubenswrapper[4735]: E1001 10:47:57.898262 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:48:12 crc kubenswrapper[4735]: I1001 10:48:12.897459 4735 scope.go:117] "RemoveContainer" containerID="6bb0238a5016780708fe4e2202a4b58f82d3046437841d7c2fad3739fa55b3ba" Oct 01 10:48:12 crc kubenswrapper[4735]: E1001 10:48:12.898262 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:48:25 crc kubenswrapper[4735]: I1001 10:48:25.043019 4735 generic.go:334] "Generic (PLEG): container finished" podID="77d6c2ca-cb77-4582-b644-d077086a29b5" containerID="8f7417c90c744adccf7683b9d9df0f09e8a8e7cb14fe1ebfb1b7fdb501e20e5b" exitCode=0 Oct 01 10:48:25 crc kubenswrapper[4735]: I1001 10:48:25.043145 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" event={"ID":"77d6c2ca-cb77-4582-b644-d077086a29b5","Type":"ContainerDied","Data":"8f7417c90c744adccf7683b9d9df0f09e8a8e7cb14fe1ebfb1b7fdb501e20e5b"} Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.421814 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.449737 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77d6c2ca-cb77-4582-b644-d077086a29b5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"77d6c2ca-cb77-4582-b644-d077086a29b5\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.449839 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-telemetry-combined-ca-bundle\") pod \"77d6c2ca-cb77-4582-b644-d077086a29b5\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.449871 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-ssh-key\") pod \"77d6c2ca-cb77-4582-b644-d077086a29b5\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.449898 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-repo-setup-combined-ca-bundle\") pod \"77d6c2ca-cb77-4582-b644-d077086a29b5\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.449944 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-nova-combined-ca-bundle\") pod \"77d6c2ca-cb77-4582-b644-d077086a29b5\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " Oct 01 
10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.450000 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-ovn-combined-ca-bundle\") pod \"77d6c2ca-cb77-4582-b644-d077086a29b5\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.450036 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw47p\" (UniqueName: \"kubernetes.io/projected/77d6c2ca-cb77-4582-b644-d077086a29b5-kube-api-access-fw47p\") pod \"77d6c2ca-cb77-4582-b644-d077086a29b5\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.450061 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-libvirt-combined-ca-bundle\") pod \"77d6c2ca-cb77-4582-b644-d077086a29b5\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.450098 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77d6c2ca-cb77-4582-b644-d077086a29b5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"77d6c2ca-cb77-4582-b644-d077086a29b5\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.450123 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-inventory\") pod \"77d6c2ca-cb77-4582-b644-d077086a29b5\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.450167 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77d6c2ca-cb77-4582-b644-d077086a29b5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"77d6c2ca-cb77-4582-b644-d077086a29b5\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.450273 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77d6c2ca-cb77-4582-b644-d077086a29b5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"77d6c2ca-cb77-4582-b644-d077086a29b5\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.450330 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-bootstrap-combined-ca-bundle\") pod \"77d6c2ca-cb77-4582-b644-d077086a29b5\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.450358 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-neutron-metadata-combined-ca-bundle\") pod \"77d6c2ca-cb77-4582-b644-d077086a29b5\" (UID: \"77d6c2ca-cb77-4582-b644-d077086a29b5\") " Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.457266 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "77d6c2ca-cb77-4582-b644-d077086a29b5" (UID: "77d6c2ca-cb77-4582-b644-d077086a29b5"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.457301 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77d6c2ca-cb77-4582-b644-d077086a29b5-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "77d6c2ca-cb77-4582-b644-d077086a29b5" (UID: "77d6c2ca-cb77-4582-b644-d077086a29b5"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.458273 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77d6c2ca-cb77-4582-b644-d077086a29b5-kube-api-access-fw47p" (OuterVolumeSpecName: "kube-api-access-fw47p") pod "77d6c2ca-cb77-4582-b644-d077086a29b5" (UID: "77d6c2ca-cb77-4582-b644-d077086a29b5"). InnerVolumeSpecName "kube-api-access-fw47p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.458798 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77d6c2ca-cb77-4582-b644-d077086a29b5-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "77d6c2ca-cb77-4582-b644-d077086a29b5" (UID: "77d6c2ca-cb77-4582-b644-d077086a29b5"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.458995 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "77d6c2ca-cb77-4582-b644-d077086a29b5" (UID: "77d6c2ca-cb77-4582-b644-d077086a29b5"). 
InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.459806 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77d6c2ca-cb77-4582-b644-d077086a29b5-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "77d6c2ca-cb77-4582-b644-d077086a29b5" (UID: "77d6c2ca-cb77-4582-b644-d077086a29b5"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.460086 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "77d6c2ca-cb77-4582-b644-d077086a29b5" (UID: "77d6c2ca-cb77-4582-b644-d077086a29b5"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.460146 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "77d6c2ca-cb77-4582-b644-d077086a29b5" (UID: "77d6c2ca-cb77-4582-b644-d077086a29b5"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.460656 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "77d6c2ca-cb77-4582-b644-d077086a29b5" (UID: "77d6c2ca-cb77-4582-b644-d077086a29b5"). 
InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.460873 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77d6c2ca-cb77-4582-b644-d077086a29b5-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "77d6c2ca-cb77-4582-b644-d077086a29b5" (UID: "77d6c2ca-cb77-4582-b644-d077086a29b5"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.461725 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "77d6c2ca-cb77-4582-b644-d077086a29b5" (UID: "77d6c2ca-cb77-4582-b644-d077086a29b5"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.464673 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "77d6c2ca-cb77-4582-b644-d077086a29b5" (UID: "77d6c2ca-cb77-4582-b644-d077086a29b5"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.484644 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "77d6c2ca-cb77-4582-b644-d077086a29b5" (UID: "77d6c2ca-cb77-4582-b644-d077086a29b5"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.485785 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-inventory" (OuterVolumeSpecName: "inventory") pod "77d6c2ca-cb77-4582-b644-d077086a29b5" (UID: "77d6c2ca-cb77-4582-b644-d077086a29b5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.553172 4735 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77d6c2ca-cb77-4582-b644-d077086a29b5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.553212 4735 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.553229 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.553241 4735 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.553254 4735 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 
10:48:26.553267 4735 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.553277 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fw47p\" (UniqueName: \"kubernetes.io/projected/77d6c2ca-cb77-4582-b644-d077086a29b5-kube-api-access-fw47p\") on node \"crc\" DevicePath \"\"" Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.553287 4735 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.553298 4735 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77d6c2ca-cb77-4582-b644-d077086a29b5-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.553311 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.553323 4735 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77d6c2ca-cb77-4582-b644-d077086a29b5-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.553335 4735 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/77d6c2ca-cb77-4582-b644-d077086a29b5-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.553349 4735 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.553363 4735 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d6c2ca-cb77-4582-b644-d077086a29b5-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:48:26 crc kubenswrapper[4735]: I1001 10:48:26.898339 4735 scope.go:117] "RemoveContainer" containerID="6bb0238a5016780708fe4e2202a4b58f82d3046437841d7c2fad3739fa55b3ba" Oct 01 10:48:26 crc kubenswrapper[4735]: E1001 10:48:26.899017 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:48:27 crc kubenswrapper[4735]: I1001 10:48:27.062123 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z" event={"ID":"77d6c2ca-cb77-4582-b644-d077086a29b5","Type":"ContainerDied","Data":"60f4922372f455438fa2d60c92f96df2f85b8695c179925afe32333494e72731"} Oct 01 10:48:27 crc kubenswrapper[4735]: I1001 10:48:27.062188 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60f4922372f455438fa2d60c92f96df2f85b8695c179925afe32333494e72731" Oct 01 10:48:27 crc 
kubenswrapper[4735]: I1001 10:48:27.062234 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z"
Oct 01 10:48:27 crc kubenswrapper[4735]: I1001 10:48:27.148781 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-srqk9"]
Oct 01 10:48:27 crc kubenswrapper[4735]: E1001 10:48:27.149188 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77d6c2ca-cb77-4582-b644-d077086a29b5" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Oct 01 10:48:27 crc kubenswrapper[4735]: I1001 10:48:27.149207 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="77d6c2ca-cb77-4582-b644-d077086a29b5" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Oct 01 10:48:27 crc kubenswrapper[4735]: I1001 10:48:27.149385 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="77d6c2ca-cb77-4582-b644-d077086a29b5" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Oct 01 10:48:27 crc kubenswrapper[4735]: I1001 10:48:27.150040 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-srqk9"
Oct 01 10:48:27 crc kubenswrapper[4735]: I1001 10:48:27.151986 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config"
Oct 01 10:48:27 crc kubenswrapper[4735]: I1001 10:48:27.156347 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ggpjk"
Oct 01 10:48:27 crc kubenswrapper[4735]: I1001 10:48:27.157844 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 01 10:48:27 crc kubenswrapper[4735]: I1001 10:48:27.157935 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 01 10:48:27 crc kubenswrapper[4735]: I1001 10:48:27.158157 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 01 10:48:27 crc kubenswrapper[4735]: I1001 10:48:27.168357 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-srqk9"]
Oct 01 10:48:27 crc kubenswrapper[4735]: I1001 10:48:27.266911 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5532968-8896-44ba-a120-62bacb3bf10a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-srqk9\" (UID: \"d5532968-8896-44ba-a120-62bacb3bf10a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-srqk9"
Oct 01 10:48:27 crc kubenswrapper[4735]: I1001 10:48:27.266957 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d5532968-8896-44ba-a120-62bacb3bf10a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-srqk9\" (UID: \"d5532968-8896-44ba-a120-62bacb3bf10a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-srqk9"
Oct 01 10:48:27 crc kubenswrapper[4735]: I1001 10:48:27.267008 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5532968-8896-44ba-a120-62bacb3bf10a-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-srqk9\" (UID: \"d5532968-8896-44ba-a120-62bacb3bf10a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-srqk9"
Oct 01 10:48:27 crc kubenswrapper[4735]: I1001 10:48:27.267039 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdhdz\" (UniqueName: \"kubernetes.io/projected/d5532968-8896-44ba-a120-62bacb3bf10a-kube-api-access-cdhdz\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-srqk9\" (UID: \"d5532968-8896-44ba-a120-62bacb3bf10a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-srqk9"
Oct 01 10:48:27 crc kubenswrapper[4735]: I1001 10:48:27.267235 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5532968-8896-44ba-a120-62bacb3bf10a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-srqk9\" (UID: \"d5532968-8896-44ba-a120-62bacb3bf10a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-srqk9"
Oct 01 10:48:27 crc kubenswrapper[4735]: I1001 10:48:27.369441 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5532968-8896-44ba-a120-62bacb3bf10a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-srqk9\" (UID: \"d5532968-8896-44ba-a120-62bacb3bf10a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-srqk9"
Oct 01 10:48:27 crc kubenswrapper[4735]: I1001 10:48:27.369538 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5532968-8896-44ba-a120-62bacb3bf10a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-srqk9\" (UID: \"d5532968-8896-44ba-a120-62bacb3bf10a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-srqk9"
Oct 01 10:48:27 crc kubenswrapper[4735]: I1001 10:48:27.369579 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d5532968-8896-44ba-a120-62bacb3bf10a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-srqk9\" (UID: \"d5532968-8896-44ba-a120-62bacb3bf10a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-srqk9"
Oct 01 10:48:27 crc kubenswrapper[4735]: I1001 10:48:27.369631 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5532968-8896-44ba-a120-62bacb3bf10a-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-srqk9\" (UID: \"d5532968-8896-44ba-a120-62bacb3bf10a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-srqk9"
Oct 01 10:48:27 crc kubenswrapper[4735]: I1001 10:48:27.369662 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdhdz\" (UniqueName: \"kubernetes.io/projected/d5532968-8896-44ba-a120-62bacb3bf10a-kube-api-access-cdhdz\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-srqk9\" (UID: \"d5532968-8896-44ba-a120-62bacb3bf10a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-srqk9"
Oct 01 10:48:27 crc kubenswrapper[4735]: I1001 10:48:27.370810 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d5532968-8896-44ba-a120-62bacb3bf10a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-srqk9\" (UID: \"d5532968-8896-44ba-a120-62bacb3bf10a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-srqk9"
Oct 01 10:48:27 crc kubenswrapper[4735]: I1001 10:48:27.374120 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5532968-8896-44ba-a120-62bacb3bf10a-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-srqk9\" (UID: \"d5532968-8896-44ba-a120-62bacb3bf10a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-srqk9"
Oct 01 10:48:27 crc kubenswrapper[4735]: I1001 10:48:27.374269 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5532968-8896-44ba-a120-62bacb3bf10a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-srqk9\" (UID: \"d5532968-8896-44ba-a120-62bacb3bf10a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-srqk9"
Oct 01 10:48:27 crc kubenswrapper[4735]: I1001 10:48:27.374923 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5532968-8896-44ba-a120-62bacb3bf10a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-srqk9\" (UID: \"d5532968-8896-44ba-a120-62bacb3bf10a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-srqk9"
Oct 01 10:48:27 crc kubenswrapper[4735]: I1001 10:48:27.386286 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdhdz\" (UniqueName: \"kubernetes.io/projected/d5532968-8896-44ba-a120-62bacb3bf10a-kube-api-access-cdhdz\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-srqk9\" (UID: \"d5532968-8896-44ba-a120-62bacb3bf10a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-srqk9"
Oct 01 10:48:27 crc kubenswrapper[4735]: I1001 10:48:27.469475 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-srqk9"
Oct 01 10:48:27 crc kubenswrapper[4735]: I1001 10:48:27.974148 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-srqk9"]
Oct 01 10:48:27 crc kubenswrapper[4735]: I1001 10:48:27.983862 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 01 10:48:28 crc kubenswrapper[4735]: I1001 10:48:28.074980 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-srqk9" event={"ID":"d5532968-8896-44ba-a120-62bacb3bf10a","Type":"ContainerStarted","Data":"c6cf10205b883a53ffe8a151204d54d8a97ce0873aa63e1c1a25e2fe9942a68b"}
Oct 01 10:48:29 crc kubenswrapper[4735]: I1001 10:48:29.090117 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-srqk9" event={"ID":"d5532968-8896-44ba-a120-62bacb3bf10a","Type":"ContainerStarted","Data":"1d5f6be621fa6546c6f3e9267ec76efb818a89fd512cf19308ef3c6aff583c50"}
Oct 01 10:48:29 crc kubenswrapper[4735]: I1001 10:48:29.120097 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-srqk9" podStartSLOduration=1.6303664709999999 podStartE2EDuration="2.120067461s" podCreationTimestamp="2025-10-01 10:48:27 +0000 UTC" firstStartedPulling="2025-10-01 10:48:27.983654041 +0000 UTC m=+1866.676475303" lastFinishedPulling="2025-10-01 10:48:28.473355021 +0000 UTC m=+1867.166176293" observedRunningTime="2025-10-01 10:48:29.114544213 +0000 UTC m=+1867.807365515" watchObservedRunningTime="2025-10-01 10:48:29.120067461 +0000 UTC m=+1867.812888763"
Oct 01 10:48:38 crc kubenswrapper[4735]: I1001 10:48:38.897299 4735 scope.go:117] "RemoveContainer" containerID="6bb0238a5016780708fe4e2202a4b58f82d3046437841d7c2fad3739fa55b3ba"
Oct 01 10:48:40 crc kubenswrapper[4735]: I1001 10:48:40.191383 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" event={"ID":"8c2fdbf0-2469-4ca0-8624-d63609123cd1","Type":"ContainerStarted","Data":"21d9d4873dd057c624c81ffd0bd8bca0905cb18a9521b7de29a283d745338bf7"}
Oct 01 10:49:33 crc kubenswrapper[4735]: I1001 10:49:33.767076 4735 generic.go:334] "Generic (PLEG): container finished" podID="d5532968-8896-44ba-a120-62bacb3bf10a" containerID="1d5f6be621fa6546c6f3e9267ec76efb818a89fd512cf19308ef3c6aff583c50" exitCode=0
Oct 01 10:49:33 crc kubenswrapper[4735]: I1001 10:49:33.767240 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-srqk9" event={"ID":"d5532968-8896-44ba-a120-62bacb3bf10a","Type":"ContainerDied","Data":"1d5f6be621fa6546c6f3e9267ec76efb818a89fd512cf19308ef3c6aff583c50"}
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.219967 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-srqk9"
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.330611 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d5532968-8896-44ba-a120-62bacb3bf10a-ovncontroller-config-0\") pod \"d5532968-8896-44ba-a120-62bacb3bf10a\" (UID: \"d5532968-8896-44ba-a120-62bacb3bf10a\") "
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.330801 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5532968-8896-44ba-a120-62bacb3bf10a-inventory\") pod \"d5532968-8896-44ba-a120-62bacb3bf10a\" (UID: \"d5532968-8896-44ba-a120-62bacb3bf10a\") "
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.330868 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5532968-8896-44ba-a120-62bacb3bf10a-ssh-key\") pod \"d5532968-8896-44ba-a120-62bacb3bf10a\" (UID: \"d5532968-8896-44ba-a120-62bacb3bf10a\") "
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.331221 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5532968-8896-44ba-a120-62bacb3bf10a-ovn-combined-ca-bundle\") pod \"d5532968-8896-44ba-a120-62bacb3bf10a\" (UID: \"d5532968-8896-44ba-a120-62bacb3bf10a\") "
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.331296 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdhdz\" (UniqueName: \"kubernetes.io/projected/d5532968-8896-44ba-a120-62bacb3bf10a-kube-api-access-cdhdz\") pod \"d5532968-8896-44ba-a120-62bacb3bf10a\" (UID: \"d5532968-8896-44ba-a120-62bacb3bf10a\") "
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.339274 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5532968-8896-44ba-a120-62bacb3bf10a-kube-api-access-cdhdz" (OuterVolumeSpecName: "kube-api-access-cdhdz") pod "d5532968-8896-44ba-a120-62bacb3bf10a" (UID: "d5532968-8896-44ba-a120-62bacb3bf10a"). InnerVolumeSpecName "kube-api-access-cdhdz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.342716 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5532968-8896-44ba-a120-62bacb3bf10a-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "d5532968-8896-44ba-a120-62bacb3bf10a" (UID: "d5532968-8896-44ba-a120-62bacb3bf10a"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.361228 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5532968-8896-44ba-a120-62bacb3bf10a-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "d5532968-8896-44ba-a120-62bacb3bf10a" (UID: "d5532968-8896-44ba-a120-62bacb3bf10a"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.367053 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5532968-8896-44ba-a120-62bacb3bf10a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d5532968-8896-44ba-a120-62bacb3bf10a" (UID: "d5532968-8896-44ba-a120-62bacb3bf10a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.367651 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5532968-8896-44ba-a120-62bacb3bf10a-inventory" (OuterVolumeSpecName: "inventory") pod "d5532968-8896-44ba-a120-62bacb3bf10a" (UID: "d5532968-8896-44ba-a120-62bacb3bf10a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.433971 4735 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5532968-8896-44ba-a120-62bacb3bf10a-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.434012 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdhdz\" (UniqueName: \"kubernetes.io/projected/d5532968-8896-44ba-a120-62bacb3bf10a-kube-api-access-cdhdz\") on node \"crc\" DevicePath \"\""
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.434025 4735 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d5532968-8896-44ba-a120-62bacb3bf10a-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.434039 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5532968-8896-44ba-a120-62bacb3bf10a-inventory\") on node \"crc\" DevicePath \"\""
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.434051 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5532968-8896-44ba-a120-62bacb3bf10a-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.788847 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-srqk9" event={"ID":"d5532968-8896-44ba-a120-62bacb3bf10a","Type":"ContainerDied","Data":"c6cf10205b883a53ffe8a151204d54d8a97ce0873aa63e1c1a25e2fe9942a68b"}
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.789112 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6cf10205b883a53ffe8a151204d54d8a97ce0873aa63e1c1a25e2fe9942a68b"
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.788959 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-srqk9"
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.908877 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw"]
Oct 01 10:49:35 crc kubenswrapper[4735]: E1001 10:49:35.909235 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5532968-8896-44ba-a120-62bacb3bf10a" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.909256 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5532968-8896-44ba-a120-62bacb3bf10a" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.909445 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5532968-8896-44ba-a120-62bacb3bf10a" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.910127 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw"
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.911739 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw"]
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.919129 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.919319 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.919529 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.920358 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ggpjk"
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.920928 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.931024 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.943036 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9d75fc81-9126-4b8e-b623-47a8c65adb8f-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw\" (UID: \"9d75fc81-9126-4b8e-b623-47a8c65adb8f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw"
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.943144 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d75fc81-9126-4b8e-b623-47a8c65adb8f-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw\" (UID: \"9d75fc81-9126-4b8e-b623-47a8c65adb8f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw"
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.943421 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9d75fc81-9126-4b8e-b623-47a8c65adb8f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw\" (UID: \"9d75fc81-9126-4b8e-b623-47a8c65adb8f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw"
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.943563 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btsgc\" (UniqueName: \"kubernetes.io/projected/9d75fc81-9126-4b8e-b623-47a8c65adb8f-kube-api-access-btsgc\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw\" (UID: \"9d75fc81-9126-4b8e-b623-47a8c65adb8f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw"
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.943637 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9d75fc81-9126-4b8e-b623-47a8c65adb8f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw\" (UID: \"9d75fc81-9126-4b8e-b623-47a8c65adb8f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw"
Oct 01 10:49:35 crc kubenswrapper[4735]: I1001 10:49:35.943665 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d75fc81-9126-4b8e-b623-47a8c65adb8f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw\" (UID: \"9d75fc81-9126-4b8e-b623-47a8c65adb8f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw"
Oct 01 10:49:36 crc kubenswrapper[4735]: I1001 10:49:36.045208 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9d75fc81-9126-4b8e-b623-47a8c65adb8f-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw\" (UID: \"9d75fc81-9126-4b8e-b623-47a8c65adb8f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw"
Oct 01 10:49:36 crc kubenswrapper[4735]: I1001 10:49:36.045323 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d75fc81-9126-4b8e-b623-47a8c65adb8f-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw\" (UID: \"9d75fc81-9126-4b8e-b623-47a8c65adb8f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw"
Oct 01 10:49:36 crc kubenswrapper[4735]: I1001 10:49:36.045383 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9d75fc81-9126-4b8e-b623-47a8c65adb8f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw\" (UID: \"9d75fc81-9126-4b8e-b623-47a8c65adb8f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw"
Oct 01 10:49:36 crc kubenswrapper[4735]: I1001 10:49:36.045440 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btsgc\" (UniqueName: \"kubernetes.io/projected/9d75fc81-9126-4b8e-b623-47a8c65adb8f-kube-api-access-btsgc\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw\" (UID: \"9d75fc81-9126-4b8e-b623-47a8c65adb8f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw"
Oct 01 10:49:36 crc kubenswrapper[4735]: I1001 10:49:36.045476 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9d75fc81-9126-4b8e-b623-47a8c65adb8f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw\" (UID: \"9d75fc81-9126-4b8e-b623-47a8c65adb8f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw"
Oct 01 10:49:36 crc kubenswrapper[4735]: I1001 10:49:36.045539 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d75fc81-9126-4b8e-b623-47a8c65adb8f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw\" (UID: \"9d75fc81-9126-4b8e-b623-47a8c65adb8f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw"
Oct 01 10:49:36 crc kubenswrapper[4735]: I1001 10:49:36.049035 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9d75fc81-9126-4b8e-b623-47a8c65adb8f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw\" (UID: \"9d75fc81-9126-4b8e-b623-47a8c65adb8f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw"
Oct 01 10:49:36 crc kubenswrapper[4735]: I1001 10:49:36.049346 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9d75fc81-9126-4b8e-b623-47a8c65adb8f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw\" (UID: \"9d75fc81-9126-4b8e-b623-47a8c65adb8f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw"
Oct 01 10:49:36 crc kubenswrapper[4735]: I1001 10:49:36.049973 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d75fc81-9126-4b8e-b623-47a8c65adb8f-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw\" (UID: \"9d75fc81-9126-4b8e-b623-47a8c65adb8f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw"
Oct 01 10:49:36 crc kubenswrapper[4735]: I1001 10:49:36.054465 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9d75fc81-9126-4b8e-b623-47a8c65adb8f-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw\" (UID: \"9d75fc81-9126-4b8e-b623-47a8c65adb8f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw"
Oct 01 10:49:36 crc kubenswrapper[4735]: I1001 10:49:36.060524 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d75fc81-9126-4b8e-b623-47a8c65adb8f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw\" (UID: \"9d75fc81-9126-4b8e-b623-47a8c65adb8f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw"
Oct 01 10:49:36 crc kubenswrapper[4735]: I1001 10:49:36.062331 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btsgc\" (UniqueName: \"kubernetes.io/projected/9d75fc81-9126-4b8e-b623-47a8c65adb8f-kube-api-access-btsgc\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw\" (UID: \"9d75fc81-9126-4b8e-b623-47a8c65adb8f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw"
Oct 01 10:49:36 crc kubenswrapper[4735]: I1001 10:49:36.235816 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw"
Oct 01 10:49:36 crc kubenswrapper[4735]: I1001 10:49:36.745125 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw"]
Oct 01 10:49:36 crc kubenswrapper[4735]: W1001 10:49:36.747312 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d75fc81_9126_4b8e_b623_47a8c65adb8f.slice/crio-4597361e1f1b12ee2bd917765022a76a3717bbd224fd2110e153241065db0897 WatchSource:0}: Error finding container 4597361e1f1b12ee2bd917765022a76a3717bbd224fd2110e153241065db0897: Status 404 returned error can't find the container with id 4597361e1f1b12ee2bd917765022a76a3717bbd224fd2110e153241065db0897
Oct 01 10:49:36 crc kubenswrapper[4735]: I1001 10:49:36.798451 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw" event={"ID":"9d75fc81-9126-4b8e-b623-47a8c65adb8f","Type":"ContainerStarted","Data":"4597361e1f1b12ee2bd917765022a76a3717bbd224fd2110e153241065db0897"}
Oct 01 10:49:38 crc kubenswrapper[4735]: I1001 10:49:38.833942 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw" event={"ID":"9d75fc81-9126-4b8e-b623-47a8c65adb8f","Type":"ContainerStarted","Data":"a2f672a46660837d53b1fb6bed4b6d269c6eda16086eaf4b21bd325e557c06aa"}
Oct 01 10:49:38 crc kubenswrapper[4735]: I1001 10:49:38.854255 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw" podStartSLOduration=2.561858141 podStartE2EDuration="3.854226853s" podCreationTimestamp="2025-10-01 10:49:35 +0000 UTC" firstStartedPulling="2025-10-01 10:49:36.751805431 +0000 UTC m=+1935.444626693" lastFinishedPulling="2025-10-01 10:49:38.044174123 +0000 UTC m=+1936.736995405" observedRunningTime="2025-10-01 10:49:38.852149587 +0000 UTC m=+1937.544970909" watchObservedRunningTime="2025-10-01 10:49:38.854226853 +0000 UTC m=+1937.547048155"
Oct 01 10:50:28 crc kubenswrapper[4735]: I1001 10:50:28.330560 4735 generic.go:334] "Generic (PLEG): container finished" podID="9d75fc81-9126-4b8e-b623-47a8c65adb8f" containerID="a2f672a46660837d53b1fb6bed4b6d269c6eda16086eaf4b21bd325e557c06aa" exitCode=0
Oct 01 10:50:28 crc kubenswrapper[4735]: I1001 10:50:28.330774 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw" event={"ID":"9d75fc81-9126-4b8e-b623-47a8c65adb8f","Type":"ContainerDied","Data":"a2f672a46660837d53b1fb6bed4b6d269c6eda16086eaf4b21bd325e557c06aa"}
Oct 01 10:50:29 crc kubenswrapper[4735]: I1001 10:50:29.792800 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw"
Oct 01 10:50:29 crc kubenswrapper[4735]: I1001 10:50:29.969079 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d75fc81-9126-4b8e-b623-47a8c65adb8f-inventory\") pod \"9d75fc81-9126-4b8e-b623-47a8c65adb8f\" (UID: \"9d75fc81-9126-4b8e-b623-47a8c65adb8f\") "
Oct 01 10:50:29 crc kubenswrapper[4735]: I1001 10:50:29.969977 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9d75fc81-9126-4b8e-b623-47a8c65adb8f-nova-metadata-neutron-config-0\") pod \"9d75fc81-9126-4b8e-b623-47a8c65adb8f\" (UID: \"9d75fc81-9126-4b8e-b623-47a8c65adb8f\") "
Oct 01 10:50:29 crc kubenswrapper[4735]: I1001 10:50:29.970682 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btsgc\" (UniqueName: \"kubernetes.io/projected/9d75fc81-9126-4b8e-b623-47a8c65adb8f-kube-api-access-btsgc\") pod \"9d75fc81-9126-4b8e-b623-47a8c65adb8f\" (UID: \"9d75fc81-9126-4b8e-b623-47a8c65adb8f\") "
Oct 01 10:50:29 crc kubenswrapper[4735]: I1001 10:50:29.971008 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9d75fc81-9126-4b8e-b623-47a8c65adb8f-ssh-key\") pod \"9d75fc81-9126-4b8e-b623-47a8c65adb8f\" (UID: \"9d75fc81-9126-4b8e-b623-47a8c65adb8f\") "
Oct 01 10:50:29 crc kubenswrapper[4735]: I1001 10:50:29.971219 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d75fc81-9126-4b8e-b623-47a8c65adb8f-neutron-metadata-combined-ca-bundle\") pod \"9d75fc81-9126-4b8e-b623-47a8c65adb8f\" (UID: \"9d75fc81-9126-4b8e-b623-47a8c65adb8f\") "
Oct 01 10:50:29 crc kubenswrapper[4735]: I1001 10:50:29.971541 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9d75fc81-9126-4b8e-b623-47a8c65adb8f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"9d75fc81-9126-4b8e-b623-47a8c65adb8f\" (UID: \"9d75fc81-9126-4b8e-b623-47a8c65adb8f\") "
Oct 01 10:50:29 crc kubenswrapper[4735]: I1001 10:50:29.978033 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d75fc81-9126-4b8e-b623-47a8c65adb8f-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9d75fc81-9126-4b8e-b623-47a8c65adb8f" (UID: "9d75fc81-9126-4b8e-b623-47a8c65adb8f"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:50:29 crc kubenswrapper[4735]: I1001 10:50:29.981068 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d75fc81-9126-4b8e-b623-47a8c65adb8f-kube-api-access-btsgc" (OuterVolumeSpecName: "kube-api-access-btsgc") pod "9d75fc81-9126-4b8e-b623-47a8c65adb8f" (UID: "9d75fc81-9126-4b8e-b623-47a8c65adb8f"). InnerVolumeSpecName "kube-api-access-btsgc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.008191 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d75fc81-9126-4b8e-b623-47a8c65adb8f-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "9d75fc81-9126-4b8e-b623-47a8c65adb8f" (UID: "9d75fc81-9126-4b8e-b623-47a8c65adb8f"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.014094 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d75fc81-9126-4b8e-b623-47a8c65adb8f-inventory" (OuterVolumeSpecName: "inventory") pod "9d75fc81-9126-4b8e-b623-47a8c65adb8f" (UID: "9d75fc81-9126-4b8e-b623-47a8c65adb8f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.028269 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d75fc81-9126-4b8e-b623-47a8c65adb8f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9d75fc81-9126-4b8e-b623-47a8c65adb8f" (UID: "9d75fc81-9126-4b8e-b623-47a8c65adb8f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.034298 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d75fc81-9126-4b8e-b623-47a8c65adb8f-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "9d75fc81-9126-4b8e-b623-47a8c65adb8f" (UID: "9d75fc81-9126-4b8e-b623-47a8c65adb8f"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.076833 4735 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9d75fc81-9126-4b8e-b623-47a8c65adb8f-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\""
Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.076869 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btsgc\" (UniqueName: \"kubernetes.io/projected/9d75fc81-9126-4b8e-b623-47a8c65adb8f-kube-api-access-btsgc\") on node \"crc\" DevicePath \"\""
Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.076880 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9d75fc81-9126-4b8e-b623-47a8c65adb8f-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.076892 4735 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d75fc81-9126-4b8e-b623-47a8c65adb8f-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.076903 4735 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9d75fc81-9126-4b8e-b623-47a8c65adb8f-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\""
Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.076914 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d75fc81-9126-4b8e-b623-47a8c65adb8f-inventory\") on node \"crc\" DevicePath \"\""
Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.359329 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw"
Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.359716 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw" event={"ID":"9d75fc81-9126-4b8e-b623-47a8c65adb8f","Type":"ContainerDied","Data":"4597361e1f1b12ee2bd917765022a76a3717bbd224fd2110e153241065db0897"}
Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.359870 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4597361e1f1b12ee2bd917765022a76a3717bbd224fd2110e153241065db0897"
Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.481696 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs"]
Oct 01 10:50:30 crc kubenswrapper[4735]: E1001 10:50:30.482142 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d75fc81-9126-4b8e-b623-47a8c65adb8f" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.482164 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d75fc81-9126-4b8e-b623-47a8c65adb8f" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.482434 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d75fc81-9126-4b8e-b623-47a8c65adb8f" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.483279 4735
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs" Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.485754 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ggpjk" Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.486893 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.488175 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.488460 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.488554 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.498015 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs"] Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.584626 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs\" (UID: \"98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs" Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.584676 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs\" (UID: 
\"98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs" Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.584748 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs\" (UID: \"98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs" Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.584781 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs\" (UID: \"98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs" Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.584813 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbf8b\" (UniqueName: \"kubernetes.io/projected/98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7-kube-api-access-xbf8b\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs\" (UID: \"98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs" Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.685946 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs\" (UID: \"98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs" Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.686020 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs\" (UID: \"98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs" Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.686120 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs\" (UID: \"98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs" Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.686178 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs\" (UID: \"98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs" Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.686232 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbf8b\" (UniqueName: \"kubernetes.io/projected/98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7-kube-api-access-xbf8b\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs\" (UID: \"98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs" Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.692339 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs\" (UID: \"98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs" Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.693951 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs\" (UID: \"98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs" Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.694271 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs\" (UID: \"98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs" Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.698267 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs\" (UID: \"98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs" Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.704231 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbf8b\" (UniqueName: \"kubernetes.io/projected/98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7-kube-api-access-xbf8b\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs\" (UID: \"98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs" Oct 01 10:50:30 crc kubenswrapper[4735]: I1001 10:50:30.810506 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs" Oct 01 10:50:31 crc kubenswrapper[4735]: I1001 10:50:31.422859 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs"] Oct 01 10:50:32 crc kubenswrapper[4735]: I1001 10:50:32.381290 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs" event={"ID":"98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7","Type":"ContainerStarted","Data":"67d1d4edd39d5283567a3dec046c2812c3dbf26e0de407b8092a4889f106c051"} Oct 01 10:50:33 crc kubenswrapper[4735]: I1001 10:50:33.394206 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs" event={"ID":"98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7","Type":"ContainerStarted","Data":"2d220c46ae0c6e564dbc7ea2d409a43a5f8e7e09267225eca8a12088a350094f"} Oct 01 10:50:45 crc kubenswrapper[4735]: I1001 10:50:45.643735 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs" podStartSLOduration=14.884536816 podStartE2EDuration="15.643714773s" podCreationTimestamp="2025-10-01 10:50:30 +0000 UTC" firstStartedPulling="2025-10-01 10:50:31.439093219 +0000 UTC m=+1990.131914481" lastFinishedPulling="2025-10-01 10:50:32.198271166 +0000 UTC m=+1990.891092438" observedRunningTime="2025-10-01 10:50:33.423179281 +0000 UTC m=+1992.116000543" watchObservedRunningTime="2025-10-01 10:50:45.643714773 +0000 UTC m=+2004.336536045" Oct 01 10:50:45 crc kubenswrapper[4735]: I1001 10:50:45.644516 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fptl6"] Oct 01 10:50:45 crc kubenswrapper[4735]: I1001 10:50:45.647642 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fptl6" Oct 01 10:50:45 crc kubenswrapper[4735]: I1001 10:50:45.656368 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fptl6"] Oct 01 10:50:45 crc kubenswrapper[4735]: I1001 10:50:45.691223 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/195c68f5-3e55-4a4f-bb2f-29c18606c0c6-utilities\") pod \"community-operators-fptl6\" (UID: \"195c68f5-3e55-4a4f-bb2f-29c18606c0c6\") " pod="openshift-marketplace/community-operators-fptl6" Oct 01 10:50:45 crc kubenswrapper[4735]: I1001 10:50:45.691309 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/195c68f5-3e55-4a4f-bb2f-29c18606c0c6-catalog-content\") pod \"community-operators-fptl6\" (UID: \"195c68f5-3e55-4a4f-bb2f-29c18606c0c6\") " pod="openshift-marketplace/community-operators-fptl6" Oct 01 10:50:45 crc kubenswrapper[4735]: I1001 10:50:45.691468 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9jr6\" (UniqueName: \"kubernetes.io/projected/195c68f5-3e55-4a4f-bb2f-29c18606c0c6-kube-api-access-k9jr6\") pod \"community-operators-fptl6\" (UID: \"195c68f5-3e55-4a4f-bb2f-29c18606c0c6\") " pod="openshift-marketplace/community-operators-fptl6" Oct 01 10:50:45 crc kubenswrapper[4735]: I1001 10:50:45.793981 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/195c68f5-3e55-4a4f-bb2f-29c18606c0c6-utilities\") pod \"community-operators-fptl6\" (UID: \"195c68f5-3e55-4a4f-bb2f-29c18606c0c6\") " pod="openshift-marketplace/community-operators-fptl6" Oct 01 10:50:45 crc kubenswrapper[4735]: I1001 10:50:45.794055 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/195c68f5-3e55-4a4f-bb2f-29c18606c0c6-catalog-content\") pod \"community-operators-fptl6\" (UID: \"195c68f5-3e55-4a4f-bb2f-29c18606c0c6\") " pod="openshift-marketplace/community-operators-fptl6" Oct 01 10:50:45 crc kubenswrapper[4735]: I1001 10:50:45.794177 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9jr6\" (UniqueName: \"kubernetes.io/projected/195c68f5-3e55-4a4f-bb2f-29c18606c0c6-kube-api-access-k9jr6\") pod \"community-operators-fptl6\" (UID: \"195c68f5-3e55-4a4f-bb2f-29c18606c0c6\") " pod="openshift-marketplace/community-operators-fptl6" Oct 01 10:50:45 crc kubenswrapper[4735]: I1001 10:50:45.794691 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/195c68f5-3e55-4a4f-bb2f-29c18606c0c6-utilities\") pod \"community-operators-fptl6\" (UID: \"195c68f5-3e55-4a4f-bb2f-29c18606c0c6\") " pod="openshift-marketplace/community-operators-fptl6" Oct 01 10:50:45 crc kubenswrapper[4735]: I1001 10:50:45.794769 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/195c68f5-3e55-4a4f-bb2f-29c18606c0c6-catalog-content\") pod \"community-operators-fptl6\" (UID: \"195c68f5-3e55-4a4f-bb2f-29c18606c0c6\") " pod="openshift-marketplace/community-operators-fptl6" Oct 01 10:50:45 crc kubenswrapper[4735]: I1001 10:50:45.817729 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9jr6\" (UniqueName: \"kubernetes.io/projected/195c68f5-3e55-4a4f-bb2f-29c18606c0c6-kube-api-access-k9jr6\") pod \"community-operators-fptl6\" (UID: \"195c68f5-3e55-4a4f-bb2f-29c18606c0c6\") " pod="openshift-marketplace/community-operators-fptl6" Oct 01 10:50:45 crc kubenswrapper[4735]: I1001 10:50:45.979149 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fptl6" Oct 01 10:50:46 crc kubenswrapper[4735]: I1001 10:50:46.486251 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fptl6"] Oct 01 10:50:46 crc kubenswrapper[4735]: W1001 10:50:46.490690 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod195c68f5_3e55_4a4f_bb2f_29c18606c0c6.slice/crio-a6883040f0df1d59a742af9a2e355c8c48c1d753807bcc87a49873c9f98c6a24 WatchSource:0}: Error finding container a6883040f0df1d59a742af9a2e355c8c48c1d753807bcc87a49873c9f98c6a24: Status 404 returned error can't find the container with id a6883040f0df1d59a742af9a2e355c8c48c1d753807bcc87a49873c9f98c6a24 Oct 01 10:50:46 crc kubenswrapper[4735]: I1001 10:50:46.535581 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fptl6" event={"ID":"195c68f5-3e55-4a4f-bb2f-29c18606c0c6","Type":"ContainerStarted","Data":"a6883040f0df1d59a742af9a2e355c8c48c1d753807bcc87a49873c9f98c6a24"} Oct 01 10:50:47 crc kubenswrapper[4735]: I1001 10:50:47.548212 4735 generic.go:334] "Generic (PLEG): container finished" podID="195c68f5-3e55-4a4f-bb2f-29c18606c0c6" containerID="cdd9c4723eddaf44483960a070c247ddfda29abeabefa10d727001f7aa6a99d9" exitCode=0 Oct 01 10:50:47 crc kubenswrapper[4735]: I1001 10:50:47.548434 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fptl6" event={"ID":"195c68f5-3e55-4a4f-bb2f-29c18606c0c6","Type":"ContainerDied","Data":"cdd9c4723eddaf44483960a070c247ddfda29abeabefa10d727001f7aa6a99d9"} Oct 01 10:50:48 crc kubenswrapper[4735]: I1001 10:50:48.565961 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fptl6" 
event={"ID":"195c68f5-3e55-4a4f-bb2f-29c18606c0c6","Type":"ContainerStarted","Data":"796658d35a4f7d4a7a34368f25193ba368be41ef65a723590c657e175009eaf0"} Oct 01 10:50:49 crc kubenswrapper[4735]: I1001 10:50:49.578194 4735 generic.go:334] "Generic (PLEG): container finished" podID="195c68f5-3e55-4a4f-bb2f-29c18606c0c6" containerID="796658d35a4f7d4a7a34368f25193ba368be41ef65a723590c657e175009eaf0" exitCode=0 Oct 01 10:50:49 crc kubenswrapper[4735]: I1001 10:50:49.578240 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fptl6" event={"ID":"195c68f5-3e55-4a4f-bb2f-29c18606c0c6","Type":"ContainerDied","Data":"796658d35a4f7d4a7a34368f25193ba368be41ef65a723590c657e175009eaf0"} Oct 01 10:50:51 crc kubenswrapper[4735]: I1001 10:50:51.598019 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fptl6" event={"ID":"195c68f5-3e55-4a4f-bb2f-29c18606c0c6","Type":"ContainerStarted","Data":"13d898f53e8f42a39d12f4c6bfae1ad97607efb0e859ff0d92036d0a8410f60f"} Oct 01 10:50:53 crc kubenswrapper[4735]: I1001 10:50:53.001069 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fptl6" podStartSLOduration=5.085362513 podStartE2EDuration="8.001045915s" podCreationTimestamp="2025-10-01 10:50:45 +0000 UTC" firstStartedPulling="2025-10-01 10:50:47.551440889 +0000 UTC m=+2006.244262151" lastFinishedPulling="2025-10-01 10:50:50.467124291 +0000 UTC m=+2009.159945553" observedRunningTime="2025-10-01 10:50:51.613023079 +0000 UTC m=+2010.305844351" watchObservedRunningTime="2025-10-01 10:50:53.001045915 +0000 UTC m=+2011.693867177" Oct 01 10:50:53 crc kubenswrapper[4735]: I1001 10:50:53.004430 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qbqfl"] Oct 01 10:50:53 crc kubenswrapper[4735]: I1001 10:50:53.009903 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qbqfl" Oct 01 10:50:53 crc kubenswrapper[4735]: I1001 10:50:53.014449 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qbqfl"] Oct 01 10:50:53 crc kubenswrapper[4735]: I1001 10:50:53.142807 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkkhg\" (UniqueName: \"kubernetes.io/projected/3f2884c7-a0a2-4a2e-90a3-38b3596ca92e-kube-api-access-pkkhg\") pod \"redhat-marketplace-qbqfl\" (UID: \"3f2884c7-a0a2-4a2e-90a3-38b3596ca92e\") " pod="openshift-marketplace/redhat-marketplace-qbqfl" Oct 01 10:50:53 crc kubenswrapper[4735]: I1001 10:50:53.143168 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f2884c7-a0a2-4a2e-90a3-38b3596ca92e-utilities\") pod \"redhat-marketplace-qbqfl\" (UID: \"3f2884c7-a0a2-4a2e-90a3-38b3596ca92e\") " pod="openshift-marketplace/redhat-marketplace-qbqfl" Oct 01 10:50:53 crc kubenswrapper[4735]: I1001 10:50:53.143311 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f2884c7-a0a2-4a2e-90a3-38b3596ca92e-catalog-content\") pod \"redhat-marketplace-qbqfl\" (UID: \"3f2884c7-a0a2-4a2e-90a3-38b3596ca92e\") " pod="openshift-marketplace/redhat-marketplace-qbqfl" Oct 01 10:50:53 crc kubenswrapper[4735]: I1001 10:50:53.245267 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkkhg\" (UniqueName: \"kubernetes.io/projected/3f2884c7-a0a2-4a2e-90a3-38b3596ca92e-kube-api-access-pkkhg\") pod \"redhat-marketplace-qbqfl\" (UID: \"3f2884c7-a0a2-4a2e-90a3-38b3596ca92e\") " pod="openshift-marketplace/redhat-marketplace-qbqfl" Oct 01 10:50:53 crc kubenswrapper[4735]: I1001 10:50:53.245437 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f2884c7-a0a2-4a2e-90a3-38b3596ca92e-utilities\") pod \"redhat-marketplace-qbqfl\" (UID: \"3f2884c7-a0a2-4a2e-90a3-38b3596ca92e\") " pod="openshift-marketplace/redhat-marketplace-qbqfl" Oct 01 10:50:53 crc kubenswrapper[4735]: I1001 10:50:53.245518 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f2884c7-a0a2-4a2e-90a3-38b3596ca92e-catalog-content\") pod \"redhat-marketplace-qbqfl\" (UID: \"3f2884c7-a0a2-4a2e-90a3-38b3596ca92e\") " pod="openshift-marketplace/redhat-marketplace-qbqfl" Oct 01 10:50:53 crc kubenswrapper[4735]: I1001 10:50:53.245966 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f2884c7-a0a2-4a2e-90a3-38b3596ca92e-utilities\") pod \"redhat-marketplace-qbqfl\" (UID: \"3f2884c7-a0a2-4a2e-90a3-38b3596ca92e\") " pod="openshift-marketplace/redhat-marketplace-qbqfl" Oct 01 10:50:53 crc kubenswrapper[4735]: I1001 10:50:53.246008 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f2884c7-a0a2-4a2e-90a3-38b3596ca92e-catalog-content\") pod \"redhat-marketplace-qbqfl\" (UID: \"3f2884c7-a0a2-4a2e-90a3-38b3596ca92e\") " pod="openshift-marketplace/redhat-marketplace-qbqfl" Oct 01 10:50:53 crc kubenswrapper[4735]: I1001 10:50:53.271908 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkkhg\" (UniqueName: \"kubernetes.io/projected/3f2884c7-a0a2-4a2e-90a3-38b3596ca92e-kube-api-access-pkkhg\") pod \"redhat-marketplace-qbqfl\" (UID: \"3f2884c7-a0a2-4a2e-90a3-38b3596ca92e\") " pod="openshift-marketplace/redhat-marketplace-qbqfl" Oct 01 10:50:53 crc kubenswrapper[4735]: I1001 10:50:53.338156 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qbqfl" Oct 01 10:50:53 crc kubenswrapper[4735]: I1001 10:50:53.765190 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qbqfl"] Oct 01 10:50:53 crc kubenswrapper[4735]: W1001 10:50:53.777899 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f2884c7_a0a2_4a2e_90a3_38b3596ca92e.slice/crio-253736090e390c0e7d0cc10a0c11b2e4ee19b3a7e2a219d5ee62249561a54526 WatchSource:0}: Error finding container 253736090e390c0e7d0cc10a0c11b2e4ee19b3a7e2a219d5ee62249561a54526: Status 404 returned error can't find the container with id 253736090e390c0e7d0cc10a0c11b2e4ee19b3a7e2a219d5ee62249561a54526 Oct 01 10:50:54 crc kubenswrapper[4735]: I1001 10:50:54.622484 4735 generic.go:334] "Generic (PLEG): container finished" podID="3f2884c7-a0a2-4a2e-90a3-38b3596ca92e" containerID="6b96301cbe4119391297975b678034bde68d24615fa0942e35c041cfacc92f4d" exitCode=0 Oct 01 10:50:54 crc kubenswrapper[4735]: I1001 10:50:54.622621 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qbqfl" event={"ID":"3f2884c7-a0a2-4a2e-90a3-38b3596ca92e","Type":"ContainerDied","Data":"6b96301cbe4119391297975b678034bde68d24615fa0942e35c041cfacc92f4d"} Oct 01 10:50:54 crc kubenswrapper[4735]: I1001 10:50:54.622805 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qbqfl" event={"ID":"3f2884c7-a0a2-4a2e-90a3-38b3596ca92e","Type":"ContainerStarted","Data":"253736090e390c0e7d0cc10a0c11b2e4ee19b3a7e2a219d5ee62249561a54526"} Oct 01 10:50:55 crc kubenswrapper[4735]: I1001 10:50:55.632538 4735 generic.go:334] "Generic (PLEG): container finished" podID="3f2884c7-a0a2-4a2e-90a3-38b3596ca92e" containerID="d016c71c4a8ea4db23f26f0c1f40d3a8c591518f24abce67681fedd585eb0558" exitCode=0 Oct 01 10:50:55 crc kubenswrapper[4735]: I1001 
10:50:55.632643 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qbqfl" event={"ID":"3f2884c7-a0a2-4a2e-90a3-38b3596ca92e","Type":"ContainerDied","Data":"d016c71c4a8ea4db23f26f0c1f40d3a8c591518f24abce67681fedd585eb0558"} Oct 01 10:50:55 crc kubenswrapper[4735]: I1001 10:50:55.979950 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fptl6" Oct 01 10:50:55 crc kubenswrapper[4735]: I1001 10:50:55.980291 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fptl6" Oct 01 10:50:56 crc kubenswrapper[4735]: I1001 10:50:56.026808 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fptl6" Oct 01 10:50:56 crc kubenswrapper[4735]: I1001 10:50:56.695252 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fptl6" Oct 01 10:50:57 crc kubenswrapper[4735]: I1001 10:50:57.652412 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qbqfl" event={"ID":"3f2884c7-a0a2-4a2e-90a3-38b3596ca92e","Type":"ContainerStarted","Data":"f34a9ea9363f4d85a73cf2ae496388eb278ad01e9ea90fb7fa9c641fc2931aec"} Oct 01 10:50:57 crc kubenswrapper[4735]: I1001 10:50:57.674972 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qbqfl" podStartSLOduration=3.647356695 podStartE2EDuration="5.674953359s" podCreationTimestamp="2025-10-01 10:50:52 +0000 UTC" firstStartedPulling="2025-10-01 10:50:54.625612969 +0000 UTC m=+2013.318434231" lastFinishedPulling="2025-10-01 10:50:56.653209623 +0000 UTC m=+2015.346030895" observedRunningTime="2025-10-01 10:50:57.671031593 +0000 UTC m=+2016.363852855" watchObservedRunningTime="2025-10-01 10:50:57.674953359 +0000 UTC m=+2016.367774621" Oct 01 
10:50:58 crc kubenswrapper[4735]: I1001 10:50:58.403636 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fptl6"] Oct 01 10:50:58 crc kubenswrapper[4735]: I1001 10:50:58.663227 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fptl6" podUID="195c68f5-3e55-4a4f-bb2f-29c18606c0c6" containerName="registry-server" containerID="cri-o://13d898f53e8f42a39d12f4c6bfae1ad97607efb0e859ff0d92036d0a8410f60f" gracePeriod=2 Oct 01 10:50:59 crc kubenswrapper[4735]: I1001 10:50:59.272509 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fptl6" Oct 01 10:50:59 crc kubenswrapper[4735]: I1001 10:50:59.360210 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9jr6\" (UniqueName: \"kubernetes.io/projected/195c68f5-3e55-4a4f-bb2f-29c18606c0c6-kube-api-access-k9jr6\") pod \"195c68f5-3e55-4a4f-bb2f-29c18606c0c6\" (UID: \"195c68f5-3e55-4a4f-bb2f-29c18606c0c6\") " Oct 01 10:50:59 crc kubenswrapper[4735]: I1001 10:50:59.360267 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/195c68f5-3e55-4a4f-bb2f-29c18606c0c6-utilities\") pod \"195c68f5-3e55-4a4f-bb2f-29c18606c0c6\" (UID: \"195c68f5-3e55-4a4f-bb2f-29c18606c0c6\") " Oct 01 10:50:59 crc kubenswrapper[4735]: I1001 10:50:59.360374 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/195c68f5-3e55-4a4f-bb2f-29c18606c0c6-catalog-content\") pod \"195c68f5-3e55-4a4f-bb2f-29c18606c0c6\" (UID: \"195c68f5-3e55-4a4f-bb2f-29c18606c0c6\") " Oct 01 10:50:59 crc kubenswrapper[4735]: I1001 10:50:59.361437 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/195c68f5-3e55-4a4f-bb2f-29c18606c0c6-utilities" (OuterVolumeSpecName: "utilities") pod "195c68f5-3e55-4a4f-bb2f-29c18606c0c6" (UID: "195c68f5-3e55-4a4f-bb2f-29c18606c0c6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:50:59 crc kubenswrapper[4735]: I1001 10:50:59.366250 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/195c68f5-3e55-4a4f-bb2f-29c18606c0c6-kube-api-access-k9jr6" (OuterVolumeSpecName: "kube-api-access-k9jr6") pod "195c68f5-3e55-4a4f-bb2f-29c18606c0c6" (UID: "195c68f5-3e55-4a4f-bb2f-29c18606c0c6"). InnerVolumeSpecName "kube-api-access-k9jr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:50:59 crc kubenswrapper[4735]: I1001 10:50:59.407894 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/195c68f5-3e55-4a4f-bb2f-29c18606c0c6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "195c68f5-3e55-4a4f-bb2f-29c18606c0c6" (UID: "195c68f5-3e55-4a4f-bb2f-29c18606c0c6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:50:59 crc kubenswrapper[4735]: I1001 10:50:59.463369 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/195c68f5-3e55-4a4f-bb2f-29c18606c0c6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 10:50:59 crc kubenswrapper[4735]: I1001 10:50:59.463412 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9jr6\" (UniqueName: \"kubernetes.io/projected/195c68f5-3e55-4a4f-bb2f-29c18606c0c6-kube-api-access-k9jr6\") on node \"crc\" DevicePath \"\"" Oct 01 10:50:59 crc kubenswrapper[4735]: I1001 10:50:59.463430 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/195c68f5-3e55-4a4f-bb2f-29c18606c0c6-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 10:50:59 crc kubenswrapper[4735]: I1001 10:50:59.689030 4735 generic.go:334] "Generic (PLEG): container finished" podID="195c68f5-3e55-4a4f-bb2f-29c18606c0c6" containerID="13d898f53e8f42a39d12f4c6bfae1ad97607efb0e859ff0d92036d0a8410f60f" exitCode=0 Oct 01 10:50:59 crc kubenswrapper[4735]: I1001 10:50:59.689075 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fptl6" event={"ID":"195c68f5-3e55-4a4f-bb2f-29c18606c0c6","Type":"ContainerDied","Data":"13d898f53e8f42a39d12f4c6bfae1ad97607efb0e859ff0d92036d0a8410f60f"} Oct 01 10:50:59 crc kubenswrapper[4735]: I1001 10:50:59.689101 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fptl6" event={"ID":"195c68f5-3e55-4a4f-bb2f-29c18606c0c6","Type":"ContainerDied","Data":"a6883040f0df1d59a742af9a2e355c8c48c1d753807bcc87a49873c9f98c6a24"} Oct 01 10:50:59 crc kubenswrapper[4735]: I1001 10:50:59.689117 4735 scope.go:117] "RemoveContainer" containerID="13d898f53e8f42a39d12f4c6bfae1ad97607efb0e859ff0d92036d0a8410f60f" Oct 01 10:50:59 crc kubenswrapper[4735]: I1001 
10:50:59.689142 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fptl6" Oct 01 10:50:59 crc kubenswrapper[4735]: I1001 10:50:59.707575 4735 scope.go:117] "RemoveContainer" containerID="796658d35a4f7d4a7a34368f25193ba368be41ef65a723590c657e175009eaf0" Oct 01 10:50:59 crc kubenswrapper[4735]: I1001 10:50:59.745154 4735 scope.go:117] "RemoveContainer" containerID="cdd9c4723eddaf44483960a070c247ddfda29abeabefa10d727001f7aa6a99d9" Oct 01 10:50:59 crc kubenswrapper[4735]: I1001 10:50:59.750368 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fptl6"] Oct 01 10:50:59 crc kubenswrapper[4735]: I1001 10:50:59.758844 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fptl6"] Oct 01 10:50:59 crc kubenswrapper[4735]: I1001 10:50:59.804752 4735 scope.go:117] "RemoveContainer" containerID="13d898f53e8f42a39d12f4c6bfae1ad97607efb0e859ff0d92036d0a8410f60f" Oct 01 10:50:59 crc kubenswrapper[4735]: E1001 10:50:59.805187 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13d898f53e8f42a39d12f4c6bfae1ad97607efb0e859ff0d92036d0a8410f60f\": container with ID starting with 13d898f53e8f42a39d12f4c6bfae1ad97607efb0e859ff0d92036d0a8410f60f not found: ID does not exist" containerID="13d898f53e8f42a39d12f4c6bfae1ad97607efb0e859ff0d92036d0a8410f60f" Oct 01 10:50:59 crc kubenswrapper[4735]: I1001 10:50:59.805231 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13d898f53e8f42a39d12f4c6bfae1ad97607efb0e859ff0d92036d0a8410f60f"} err="failed to get container status \"13d898f53e8f42a39d12f4c6bfae1ad97607efb0e859ff0d92036d0a8410f60f\": rpc error: code = NotFound desc = could not find container \"13d898f53e8f42a39d12f4c6bfae1ad97607efb0e859ff0d92036d0a8410f60f\": container with ID starting with 
13d898f53e8f42a39d12f4c6bfae1ad97607efb0e859ff0d92036d0a8410f60f not found: ID does not exist" Oct 01 10:50:59 crc kubenswrapper[4735]: I1001 10:50:59.805262 4735 scope.go:117] "RemoveContainer" containerID="796658d35a4f7d4a7a34368f25193ba368be41ef65a723590c657e175009eaf0" Oct 01 10:50:59 crc kubenswrapper[4735]: E1001 10:50:59.805611 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"796658d35a4f7d4a7a34368f25193ba368be41ef65a723590c657e175009eaf0\": container with ID starting with 796658d35a4f7d4a7a34368f25193ba368be41ef65a723590c657e175009eaf0 not found: ID does not exist" containerID="796658d35a4f7d4a7a34368f25193ba368be41ef65a723590c657e175009eaf0" Oct 01 10:50:59 crc kubenswrapper[4735]: I1001 10:50:59.805646 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"796658d35a4f7d4a7a34368f25193ba368be41ef65a723590c657e175009eaf0"} err="failed to get container status \"796658d35a4f7d4a7a34368f25193ba368be41ef65a723590c657e175009eaf0\": rpc error: code = NotFound desc = could not find container \"796658d35a4f7d4a7a34368f25193ba368be41ef65a723590c657e175009eaf0\": container with ID starting with 796658d35a4f7d4a7a34368f25193ba368be41ef65a723590c657e175009eaf0 not found: ID does not exist" Oct 01 10:50:59 crc kubenswrapper[4735]: I1001 10:50:59.805674 4735 scope.go:117] "RemoveContainer" containerID="cdd9c4723eddaf44483960a070c247ddfda29abeabefa10d727001f7aa6a99d9" Oct 01 10:50:59 crc kubenswrapper[4735]: E1001 10:50:59.805975 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdd9c4723eddaf44483960a070c247ddfda29abeabefa10d727001f7aa6a99d9\": container with ID starting with cdd9c4723eddaf44483960a070c247ddfda29abeabefa10d727001f7aa6a99d9 not found: ID does not exist" containerID="cdd9c4723eddaf44483960a070c247ddfda29abeabefa10d727001f7aa6a99d9" Oct 01 10:50:59 crc 
kubenswrapper[4735]: I1001 10:50:59.806006 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdd9c4723eddaf44483960a070c247ddfda29abeabefa10d727001f7aa6a99d9"} err="failed to get container status \"cdd9c4723eddaf44483960a070c247ddfda29abeabefa10d727001f7aa6a99d9\": rpc error: code = NotFound desc = could not find container \"cdd9c4723eddaf44483960a070c247ddfda29abeabefa10d727001f7aa6a99d9\": container with ID starting with cdd9c4723eddaf44483960a070c247ddfda29abeabefa10d727001f7aa6a99d9 not found: ID does not exist" Oct 01 10:50:59 crc kubenswrapper[4735]: I1001 10:50:59.908620 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="195c68f5-3e55-4a4f-bb2f-29c18606c0c6" path="/var/lib/kubelet/pods/195c68f5-3e55-4a4f-bb2f-29c18606c0c6/volumes" Oct 01 10:51:03 crc kubenswrapper[4735]: I1001 10:51:03.338401 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qbqfl" Oct 01 10:51:03 crc kubenswrapper[4735]: I1001 10:51:03.339026 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qbqfl" Oct 01 10:51:03 crc kubenswrapper[4735]: I1001 10:51:03.408890 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qbqfl" Oct 01 10:51:03 crc kubenswrapper[4735]: I1001 10:51:03.806532 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qbqfl" Oct 01 10:51:03 crc kubenswrapper[4735]: I1001 10:51:03.864268 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qbqfl"] Oct 01 10:51:05 crc kubenswrapper[4735]: I1001 10:51:05.486020 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 10:51:05 crc kubenswrapper[4735]: I1001 10:51:05.486093 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 10:51:05 crc kubenswrapper[4735]: I1001 10:51:05.740737 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qbqfl" podUID="3f2884c7-a0a2-4a2e-90a3-38b3596ca92e" containerName="registry-server" containerID="cri-o://f34a9ea9363f4d85a73cf2ae496388eb278ad01e9ea90fb7fa9c641fc2931aec" gracePeriod=2 Oct 01 10:51:06 crc kubenswrapper[4735]: I1001 10:51:06.206353 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qbqfl" Oct 01 10:51:06 crc kubenswrapper[4735]: I1001 10:51:06.293002 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f2884c7-a0a2-4a2e-90a3-38b3596ca92e-utilities\") pod \"3f2884c7-a0a2-4a2e-90a3-38b3596ca92e\" (UID: \"3f2884c7-a0a2-4a2e-90a3-38b3596ca92e\") " Oct 01 10:51:06 crc kubenswrapper[4735]: I1001 10:51:06.293068 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f2884c7-a0a2-4a2e-90a3-38b3596ca92e-catalog-content\") pod \"3f2884c7-a0a2-4a2e-90a3-38b3596ca92e\" (UID: \"3f2884c7-a0a2-4a2e-90a3-38b3596ca92e\") " Oct 01 10:51:06 crc kubenswrapper[4735]: I1001 10:51:06.293260 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkkhg\" (UniqueName: 
\"kubernetes.io/projected/3f2884c7-a0a2-4a2e-90a3-38b3596ca92e-kube-api-access-pkkhg\") pod \"3f2884c7-a0a2-4a2e-90a3-38b3596ca92e\" (UID: \"3f2884c7-a0a2-4a2e-90a3-38b3596ca92e\") " Oct 01 10:51:06 crc kubenswrapper[4735]: I1001 10:51:06.294390 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f2884c7-a0a2-4a2e-90a3-38b3596ca92e-utilities" (OuterVolumeSpecName: "utilities") pod "3f2884c7-a0a2-4a2e-90a3-38b3596ca92e" (UID: "3f2884c7-a0a2-4a2e-90a3-38b3596ca92e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:51:06 crc kubenswrapper[4735]: I1001 10:51:06.300129 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f2884c7-a0a2-4a2e-90a3-38b3596ca92e-kube-api-access-pkkhg" (OuterVolumeSpecName: "kube-api-access-pkkhg") pod "3f2884c7-a0a2-4a2e-90a3-38b3596ca92e" (UID: "3f2884c7-a0a2-4a2e-90a3-38b3596ca92e"). InnerVolumeSpecName "kube-api-access-pkkhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:51:06 crc kubenswrapper[4735]: I1001 10:51:06.397336 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f2884c7-a0a2-4a2e-90a3-38b3596ca92e-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 10:51:06 crc kubenswrapper[4735]: I1001 10:51:06.397679 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkkhg\" (UniqueName: \"kubernetes.io/projected/3f2884c7-a0a2-4a2e-90a3-38b3596ca92e-kube-api-access-pkkhg\") on node \"crc\" DevicePath \"\"" Oct 01 10:51:06 crc kubenswrapper[4735]: I1001 10:51:06.405542 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f2884c7-a0a2-4a2e-90a3-38b3596ca92e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3f2884c7-a0a2-4a2e-90a3-38b3596ca92e" (UID: "3f2884c7-a0a2-4a2e-90a3-38b3596ca92e"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:51:06 crc kubenswrapper[4735]: I1001 10:51:06.499619 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f2884c7-a0a2-4a2e-90a3-38b3596ca92e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 10:51:06 crc kubenswrapper[4735]: I1001 10:51:06.751977 4735 generic.go:334] "Generic (PLEG): container finished" podID="3f2884c7-a0a2-4a2e-90a3-38b3596ca92e" containerID="f34a9ea9363f4d85a73cf2ae496388eb278ad01e9ea90fb7fa9c641fc2931aec" exitCode=0 Oct 01 10:51:06 crc kubenswrapper[4735]: I1001 10:51:06.752031 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qbqfl" event={"ID":"3f2884c7-a0a2-4a2e-90a3-38b3596ca92e","Type":"ContainerDied","Data":"f34a9ea9363f4d85a73cf2ae496388eb278ad01e9ea90fb7fa9c641fc2931aec"} Oct 01 10:51:06 crc kubenswrapper[4735]: I1001 10:51:06.752086 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qbqfl" event={"ID":"3f2884c7-a0a2-4a2e-90a3-38b3596ca92e","Type":"ContainerDied","Data":"253736090e390c0e7d0cc10a0c11b2e4ee19b3a7e2a219d5ee62249561a54526"} Oct 01 10:51:06 crc kubenswrapper[4735]: I1001 10:51:06.752111 4735 scope.go:117] "RemoveContainer" containerID="f34a9ea9363f4d85a73cf2ae496388eb278ad01e9ea90fb7fa9c641fc2931aec" Oct 01 10:51:06 crc kubenswrapper[4735]: I1001 10:51:06.752386 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qbqfl" Oct 01 10:51:06 crc kubenswrapper[4735]: I1001 10:51:06.789296 4735 scope.go:117] "RemoveContainer" containerID="d016c71c4a8ea4db23f26f0c1f40d3a8c591518f24abce67681fedd585eb0558" Oct 01 10:51:06 crc kubenswrapper[4735]: I1001 10:51:06.790819 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qbqfl"] Oct 01 10:51:06 crc kubenswrapper[4735]: I1001 10:51:06.799068 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qbqfl"] Oct 01 10:51:06 crc kubenswrapper[4735]: I1001 10:51:06.809274 4735 scope.go:117] "RemoveContainer" containerID="6b96301cbe4119391297975b678034bde68d24615fa0942e35c041cfacc92f4d" Oct 01 10:51:06 crc kubenswrapper[4735]: I1001 10:51:06.849853 4735 scope.go:117] "RemoveContainer" containerID="f34a9ea9363f4d85a73cf2ae496388eb278ad01e9ea90fb7fa9c641fc2931aec" Oct 01 10:51:06 crc kubenswrapper[4735]: E1001 10:51:06.850291 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f34a9ea9363f4d85a73cf2ae496388eb278ad01e9ea90fb7fa9c641fc2931aec\": container with ID starting with f34a9ea9363f4d85a73cf2ae496388eb278ad01e9ea90fb7fa9c641fc2931aec not found: ID does not exist" containerID="f34a9ea9363f4d85a73cf2ae496388eb278ad01e9ea90fb7fa9c641fc2931aec" Oct 01 10:51:06 crc kubenswrapper[4735]: I1001 10:51:06.850325 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f34a9ea9363f4d85a73cf2ae496388eb278ad01e9ea90fb7fa9c641fc2931aec"} err="failed to get container status \"f34a9ea9363f4d85a73cf2ae496388eb278ad01e9ea90fb7fa9c641fc2931aec\": rpc error: code = NotFound desc = could not find container \"f34a9ea9363f4d85a73cf2ae496388eb278ad01e9ea90fb7fa9c641fc2931aec\": container with ID starting with f34a9ea9363f4d85a73cf2ae496388eb278ad01e9ea90fb7fa9c641fc2931aec not found: 
ID does not exist" Oct 01 10:51:06 crc kubenswrapper[4735]: I1001 10:51:06.850345 4735 scope.go:117] "RemoveContainer" containerID="d016c71c4a8ea4db23f26f0c1f40d3a8c591518f24abce67681fedd585eb0558" Oct 01 10:51:06 crc kubenswrapper[4735]: E1001 10:51:06.850735 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d016c71c4a8ea4db23f26f0c1f40d3a8c591518f24abce67681fedd585eb0558\": container with ID starting with d016c71c4a8ea4db23f26f0c1f40d3a8c591518f24abce67681fedd585eb0558 not found: ID does not exist" containerID="d016c71c4a8ea4db23f26f0c1f40d3a8c591518f24abce67681fedd585eb0558" Oct 01 10:51:06 crc kubenswrapper[4735]: I1001 10:51:06.850755 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d016c71c4a8ea4db23f26f0c1f40d3a8c591518f24abce67681fedd585eb0558"} err="failed to get container status \"d016c71c4a8ea4db23f26f0c1f40d3a8c591518f24abce67681fedd585eb0558\": rpc error: code = NotFound desc = could not find container \"d016c71c4a8ea4db23f26f0c1f40d3a8c591518f24abce67681fedd585eb0558\": container with ID starting with d016c71c4a8ea4db23f26f0c1f40d3a8c591518f24abce67681fedd585eb0558 not found: ID does not exist" Oct 01 10:51:06 crc kubenswrapper[4735]: I1001 10:51:06.850771 4735 scope.go:117] "RemoveContainer" containerID="6b96301cbe4119391297975b678034bde68d24615fa0942e35c041cfacc92f4d" Oct 01 10:51:06 crc kubenswrapper[4735]: E1001 10:51:06.851130 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b96301cbe4119391297975b678034bde68d24615fa0942e35c041cfacc92f4d\": container with ID starting with 6b96301cbe4119391297975b678034bde68d24615fa0942e35c041cfacc92f4d not found: ID does not exist" containerID="6b96301cbe4119391297975b678034bde68d24615fa0942e35c041cfacc92f4d" Oct 01 10:51:06 crc kubenswrapper[4735]: I1001 10:51:06.851153 4735 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b96301cbe4119391297975b678034bde68d24615fa0942e35c041cfacc92f4d"} err="failed to get container status \"6b96301cbe4119391297975b678034bde68d24615fa0942e35c041cfacc92f4d\": rpc error: code = NotFound desc = could not find container \"6b96301cbe4119391297975b678034bde68d24615fa0942e35c041cfacc92f4d\": container with ID starting with 6b96301cbe4119391297975b678034bde68d24615fa0942e35c041cfacc92f4d not found: ID does not exist" Oct 01 10:51:07 crc kubenswrapper[4735]: I1001 10:51:07.907885 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f2884c7-a0a2-4a2e-90a3-38b3596ca92e" path="/var/lib/kubelet/pods/3f2884c7-a0a2-4a2e-90a3-38b3596ca92e/volumes" Oct 01 10:51:14 crc kubenswrapper[4735]: I1001 10:51:14.717728 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b6mjh"] Oct 01 10:51:14 crc kubenswrapper[4735]: E1001 10:51:14.720335 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f2884c7-a0a2-4a2e-90a3-38b3596ca92e" containerName="extract-content" Oct 01 10:51:14 crc kubenswrapper[4735]: I1001 10:51:14.720356 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f2884c7-a0a2-4a2e-90a3-38b3596ca92e" containerName="extract-content" Oct 01 10:51:14 crc kubenswrapper[4735]: E1001 10:51:14.720370 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f2884c7-a0a2-4a2e-90a3-38b3596ca92e" containerName="registry-server" Oct 01 10:51:14 crc kubenswrapper[4735]: I1001 10:51:14.720376 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f2884c7-a0a2-4a2e-90a3-38b3596ca92e" containerName="registry-server" Oct 01 10:51:14 crc kubenswrapper[4735]: E1001 10:51:14.720394 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="195c68f5-3e55-4a4f-bb2f-29c18606c0c6" containerName="registry-server" Oct 01 10:51:14 crc kubenswrapper[4735]: I1001 10:51:14.720400 4735 
state_mem.go:107] "Deleted CPUSet assignment" podUID="195c68f5-3e55-4a4f-bb2f-29c18606c0c6" containerName="registry-server" Oct 01 10:51:14 crc kubenswrapper[4735]: E1001 10:51:14.720415 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="195c68f5-3e55-4a4f-bb2f-29c18606c0c6" containerName="extract-content" Oct 01 10:51:14 crc kubenswrapper[4735]: I1001 10:51:14.720421 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="195c68f5-3e55-4a4f-bb2f-29c18606c0c6" containerName="extract-content" Oct 01 10:51:14 crc kubenswrapper[4735]: E1001 10:51:14.720436 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="195c68f5-3e55-4a4f-bb2f-29c18606c0c6" containerName="extract-utilities" Oct 01 10:51:14 crc kubenswrapper[4735]: I1001 10:51:14.720442 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="195c68f5-3e55-4a4f-bb2f-29c18606c0c6" containerName="extract-utilities" Oct 01 10:51:14 crc kubenswrapper[4735]: E1001 10:51:14.720460 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f2884c7-a0a2-4a2e-90a3-38b3596ca92e" containerName="extract-utilities" Oct 01 10:51:14 crc kubenswrapper[4735]: I1001 10:51:14.720466 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f2884c7-a0a2-4a2e-90a3-38b3596ca92e" containerName="extract-utilities" Oct 01 10:51:14 crc kubenswrapper[4735]: I1001 10:51:14.720670 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f2884c7-a0a2-4a2e-90a3-38b3596ca92e" containerName="registry-server" Oct 01 10:51:14 crc kubenswrapper[4735]: I1001 10:51:14.720688 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="195c68f5-3e55-4a4f-bb2f-29c18606c0c6" containerName="registry-server" Oct 01 10:51:14 crc kubenswrapper[4735]: I1001 10:51:14.721984 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b6mjh" Oct 01 10:51:14 crc kubenswrapper[4735]: I1001 10:51:14.726539 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b6mjh"] Oct 01 10:51:14 crc kubenswrapper[4735]: I1001 10:51:14.796279 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf-utilities\") pod \"certified-operators-b6mjh\" (UID: \"174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf\") " pod="openshift-marketplace/certified-operators-b6mjh" Oct 01 10:51:14 crc kubenswrapper[4735]: I1001 10:51:14.796370 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dx2w\" (UniqueName: \"kubernetes.io/projected/174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf-kube-api-access-4dx2w\") pod \"certified-operators-b6mjh\" (UID: \"174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf\") " pod="openshift-marketplace/certified-operators-b6mjh" Oct 01 10:51:14 crc kubenswrapper[4735]: I1001 10:51:14.796403 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf-catalog-content\") pod \"certified-operators-b6mjh\" (UID: \"174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf\") " pod="openshift-marketplace/certified-operators-b6mjh" Oct 01 10:51:14 crc kubenswrapper[4735]: I1001 10:51:14.897733 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf-utilities\") pod \"certified-operators-b6mjh\" (UID: \"174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf\") " pod="openshift-marketplace/certified-operators-b6mjh" Oct 01 10:51:14 crc kubenswrapper[4735]: I1001 10:51:14.897821 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4dx2w\" (UniqueName: \"kubernetes.io/projected/174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf-kube-api-access-4dx2w\") pod \"certified-operators-b6mjh\" (UID: \"174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf\") " pod="openshift-marketplace/certified-operators-b6mjh" Oct 01 10:51:14 crc kubenswrapper[4735]: I1001 10:51:14.897842 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf-catalog-content\") pod \"certified-operators-b6mjh\" (UID: \"174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf\") " pod="openshift-marketplace/certified-operators-b6mjh" Oct 01 10:51:14 crc kubenswrapper[4735]: I1001 10:51:14.898377 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf-catalog-content\") pod \"certified-operators-b6mjh\" (UID: \"174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf\") " pod="openshift-marketplace/certified-operators-b6mjh" Oct 01 10:51:14 crc kubenswrapper[4735]: I1001 10:51:14.898461 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf-utilities\") pod \"certified-operators-b6mjh\" (UID: \"174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf\") " pod="openshift-marketplace/certified-operators-b6mjh" Oct 01 10:51:14 crc kubenswrapper[4735]: I1001 10:51:14.922022 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dx2w\" (UniqueName: \"kubernetes.io/projected/174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf-kube-api-access-4dx2w\") pod \"certified-operators-b6mjh\" (UID: \"174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf\") " pod="openshift-marketplace/certified-operators-b6mjh" Oct 01 10:51:15 crc kubenswrapper[4735]: I1001 10:51:15.051607 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b6mjh" Oct 01 10:51:15 crc kubenswrapper[4735]: I1001 10:51:15.581154 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b6mjh"] Oct 01 10:51:15 crc kubenswrapper[4735]: I1001 10:51:15.843621 4735 generic.go:334] "Generic (PLEG): container finished" podID="174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf" containerID="e3924ce81c6baa9a870ebc62b87fd96caa5921a35647d691f78cce90618fb995" exitCode=0 Oct 01 10:51:15 crc kubenswrapper[4735]: I1001 10:51:15.843674 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6mjh" event={"ID":"174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf","Type":"ContainerDied","Data":"e3924ce81c6baa9a870ebc62b87fd96caa5921a35647d691f78cce90618fb995"} Oct 01 10:51:15 crc kubenswrapper[4735]: I1001 10:51:15.843956 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6mjh" event={"ID":"174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf","Type":"ContainerStarted","Data":"9e619f7d3dd58a5a462773c90ed5acee9d3dea034dd0b6119456e22b40ede7fc"} Oct 01 10:51:18 crc kubenswrapper[4735]: I1001 10:51:18.881482 4735 generic.go:334] "Generic (PLEG): container finished" podID="174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf" containerID="c8e4bf76245bbe8119f17607723abc2ee668631aed82d1e37cd0ba7b480baf03" exitCode=0 Oct 01 10:51:18 crc kubenswrapper[4735]: I1001 10:51:18.881630 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6mjh" event={"ID":"174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf","Type":"ContainerDied","Data":"c8e4bf76245bbe8119f17607723abc2ee668631aed82d1e37cd0ba7b480baf03"} Oct 01 10:51:19 crc kubenswrapper[4735]: I1001 10:51:19.505282 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v5rcl"] Oct 01 10:51:19 crc kubenswrapper[4735]: I1001 10:51:19.510373 4735 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v5rcl" Oct 01 10:51:19 crc kubenswrapper[4735]: I1001 10:51:19.532284 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v5rcl"] Oct 01 10:51:19 crc kubenswrapper[4735]: I1001 10:51:19.602127 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb2b7c02-f428-4eb6-8775-8cef8429e66b-utilities\") pod \"redhat-operators-v5rcl\" (UID: \"eb2b7c02-f428-4eb6-8775-8cef8429e66b\") " pod="openshift-marketplace/redhat-operators-v5rcl" Oct 01 10:51:19 crc kubenswrapper[4735]: I1001 10:51:19.602283 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb2b7c02-f428-4eb6-8775-8cef8429e66b-catalog-content\") pod \"redhat-operators-v5rcl\" (UID: \"eb2b7c02-f428-4eb6-8775-8cef8429e66b\") " pod="openshift-marketplace/redhat-operators-v5rcl" Oct 01 10:51:19 crc kubenswrapper[4735]: I1001 10:51:19.602375 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrc5q\" (UniqueName: \"kubernetes.io/projected/eb2b7c02-f428-4eb6-8775-8cef8429e66b-kube-api-access-xrc5q\") pod \"redhat-operators-v5rcl\" (UID: \"eb2b7c02-f428-4eb6-8775-8cef8429e66b\") " pod="openshift-marketplace/redhat-operators-v5rcl" Oct 01 10:51:19 crc kubenswrapper[4735]: I1001 10:51:19.703856 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb2b7c02-f428-4eb6-8775-8cef8429e66b-catalog-content\") pod \"redhat-operators-v5rcl\" (UID: \"eb2b7c02-f428-4eb6-8775-8cef8429e66b\") " pod="openshift-marketplace/redhat-operators-v5rcl" Oct 01 10:51:19 crc kubenswrapper[4735]: I1001 10:51:19.704036 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xrc5q\" (UniqueName: \"kubernetes.io/projected/eb2b7c02-f428-4eb6-8775-8cef8429e66b-kube-api-access-xrc5q\") pod \"redhat-operators-v5rcl\" (UID: \"eb2b7c02-f428-4eb6-8775-8cef8429e66b\") " pod="openshift-marketplace/redhat-operators-v5rcl" Oct 01 10:51:19 crc kubenswrapper[4735]: I1001 10:51:19.704113 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb2b7c02-f428-4eb6-8775-8cef8429e66b-utilities\") pod \"redhat-operators-v5rcl\" (UID: \"eb2b7c02-f428-4eb6-8775-8cef8429e66b\") " pod="openshift-marketplace/redhat-operators-v5rcl" Oct 01 10:51:19 crc kubenswrapper[4735]: I1001 10:51:19.704432 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb2b7c02-f428-4eb6-8775-8cef8429e66b-catalog-content\") pod \"redhat-operators-v5rcl\" (UID: \"eb2b7c02-f428-4eb6-8775-8cef8429e66b\") " pod="openshift-marketplace/redhat-operators-v5rcl" Oct 01 10:51:19 crc kubenswrapper[4735]: I1001 10:51:19.704445 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb2b7c02-f428-4eb6-8775-8cef8429e66b-utilities\") pod \"redhat-operators-v5rcl\" (UID: \"eb2b7c02-f428-4eb6-8775-8cef8429e66b\") " pod="openshift-marketplace/redhat-operators-v5rcl" Oct 01 10:51:19 crc kubenswrapper[4735]: I1001 10:51:19.730073 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrc5q\" (UniqueName: \"kubernetes.io/projected/eb2b7c02-f428-4eb6-8775-8cef8429e66b-kube-api-access-xrc5q\") pod \"redhat-operators-v5rcl\" (UID: \"eb2b7c02-f428-4eb6-8775-8cef8429e66b\") " pod="openshift-marketplace/redhat-operators-v5rcl" Oct 01 10:51:19 crc kubenswrapper[4735]: I1001 10:51:19.831999 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v5rcl" Oct 01 10:51:20 crc kubenswrapper[4735]: I1001 10:51:20.304388 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v5rcl"] Oct 01 10:51:20 crc kubenswrapper[4735]: W1001 10:51:20.309570 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb2b7c02_f428_4eb6_8775_8cef8429e66b.slice/crio-e919977f3d95d52f2f0d8f71bd12821ebdfd7cf99f4ca545af788318480cc89b WatchSource:0}: Error finding container e919977f3d95d52f2f0d8f71bd12821ebdfd7cf99f4ca545af788318480cc89b: Status 404 returned error can't find the container with id e919977f3d95d52f2f0d8f71bd12821ebdfd7cf99f4ca545af788318480cc89b Oct 01 10:51:20 crc kubenswrapper[4735]: I1001 10:51:20.902010 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v5rcl" event={"ID":"eb2b7c02-f428-4eb6-8775-8cef8429e66b","Type":"ContainerStarted","Data":"e919977f3d95d52f2f0d8f71bd12821ebdfd7cf99f4ca545af788318480cc89b"} Oct 01 10:51:21 crc kubenswrapper[4735]: I1001 10:51:21.944851 4735 generic.go:334] "Generic (PLEG): container finished" podID="eb2b7c02-f428-4eb6-8775-8cef8429e66b" containerID="74bcde99a37257db75be90a4c3cd91b532e5807266a2602ccf9627829077c9ab" exitCode=0 Oct 01 10:51:21 crc kubenswrapper[4735]: I1001 10:51:21.947412 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v5rcl" event={"ID":"eb2b7c02-f428-4eb6-8775-8cef8429e66b","Type":"ContainerDied","Data":"74bcde99a37257db75be90a4c3cd91b532e5807266a2602ccf9627829077c9ab"} Oct 01 10:51:25 crc kubenswrapper[4735]: I1001 10:51:25.014888 4735 generic.go:334] "Generic (PLEG): container finished" podID="eb2b7c02-f428-4eb6-8775-8cef8429e66b" containerID="3b892661872338ecdb9faeb2fb2dda77ec9a1f8a0484eca41e9021c41e8156f1" exitCode=0 Oct 01 10:51:25 crc kubenswrapper[4735]: I1001 10:51:25.016706 
4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v5rcl" event={"ID":"eb2b7c02-f428-4eb6-8775-8cef8429e66b","Type":"ContainerDied","Data":"3b892661872338ecdb9faeb2fb2dda77ec9a1f8a0484eca41e9021c41e8156f1"} Oct 01 10:51:25 crc kubenswrapper[4735]: I1001 10:51:25.020394 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6mjh" event={"ID":"174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf","Type":"ContainerStarted","Data":"033417a201fd65b77a1dad1124095017b7b34c2723f9f89727ae86b87f9cbfe1"} Oct 01 10:51:25 crc kubenswrapper[4735]: I1001 10:51:25.052545 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b6mjh" Oct 01 10:51:25 crc kubenswrapper[4735]: I1001 10:51:25.052907 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b6mjh" Oct 01 10:51:25 crc kubenswrapper[4735]: I1001 10:51:25.055767 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b6mjh" podStartSLOduration=2.743712717 podStartE2EDuration="11.055744839s" podCreationTimestamp="2025-10-01 10:51:14 +0000 UTC" firstStartedPulling="2025-10-01 10:51:15.849025972 +0000 UTC m=+2034.541847234" lastFinishedPulling="2025-10-01 10:51:24.161058094 +0000 UTC m=+2042.853879356" observedRunningTime="2025-10-01 10:51:25.050338405 +0000 UTC m=+2043.743159677" watchObservedRunningTime="2025-10-01 10:51:25.055744839 +0000 UTC m=+2043.748566101" Oct 01 10:51:26 crc kubenswrapper[4735]: I1001 10:51:26.100587 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-b6mjh" podUID="174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf" containerName="registry-server" probeResult="failure" output=< Oct 01 10:51:26 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Oct 01 10:51:26 crc 
kubenswrapper[4735]: > Oct 01 10:51:27 crc kubenswrapper[4735]: I1001 10:51:27.047181 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v5rcl" event={"ID":"eb2b7c02-f428-4eb6-8775-8cef8429e66b","Type":"ContainerStarted","Data":"b2a9e5cbab9b491f9fa46c7f36ffa723c543571d6785129d92a4c28c32134767"} Oct 01 10:51:28 crc kubenswrapper[4735]: I1001 10:51:28.077420 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v5rcl" podStartSLOduration=4.749048972 podStartE2EDuration="9.077397454s" podCreationTimestamp="2025-10-01 10:51:19 +0000 UTC" firstStartedPulling="2025-10-01 10:51:21.948423659 +0000 UTC m=+2040.641244931" lastFinishedPulling="2025-10-01 10:51:26.276772151 +0000 UTC m=+2044.969593413" observedRunningTime="2025-10-01 10:51:28.072354378 +0000 UTC m=+2046.765175650" watchObservedRunningTime="2025-10-01 10:51:28.077397454 +0000 UTC m=+2046.770218716" Oct 01 10:51:29 crc kubenswrapper[4735]: I1001 10:51:29.833675 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v5rcl" Oct 01 10:51:29 crc kubenswrapper[4735]: I1001 10:51:29.833966 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v5rcl" Oct 01 10:51:30 crc kubenswrapper[4735]: I1001 10:51:30.880431 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v5rcl" podUID="eb2b7c02-f428-4eb6-8775-8cef8429e66b" containerName="registry-server" probeResult="failure" output=< Oct 01 10:51:30 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Oct 01 10:51:30 crc kubenswrapper[4735]: > Oct 01 10:51:35 crc kubenswrapper[4735]: I1001 10:51:35.136830 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b6mjh" Oct 01 10:51:35 crc kubenswrapper[4735]: 
I1001 10:51:35.228284 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b6mjh" Oct 01 10:51:35 crc kubenswrapper[4735]: I1001 10:51:35.399469 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b6mjh"] Oct 01 10:51:35 crc kubenswrapper[4735]: I1001 10:51:35.485393 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 10:51:35 crc kubenswrapper[4735]: I1001 10:51:35.485537 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 10:51:37 crc kubenswrapper[4735]: I1001 10:51:37.153673 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b6mjh" podUID="174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf" containerName="registry-server" containerID="cri-o://033417a201fd65b77a1dad1124095017b7b34c2723f9f89727ae86b87f9cbfe1" gracePeriod=2 Oct 01 10:51:37 crc kubenswrapper[4735]: I1001 10:51:37.642206 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b6mjh" Oct 01 10:51:37 crc kubenswrapper[4735]: I1001 10:51:37.775558 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf-catalog-content\") pod \"174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf\" (UID: \"174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf\") " Oct 01 10:51:37 crc kubenswrapper[4735]: I1001 10:51:37.776082 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dx2w\" (UniqueName: \"kubernetes.io/projected/174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf-kube-api-access-4dx2w\") pod \"174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf\" (UID: \"174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf\") " Oct 01 10:51:37 crc kubenswrapper[4735]: I1001 10:51:37.776460 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf-utilities\") pod \"174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf\" (UID: \"174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf\") " Oct 01 10:51:37 crc kubenswrapper[4735]: I1001 10:51:37.777300 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf-utilities" (OuterVolumeSpecName: "utilities") pod "174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf" (UID: "174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:51:37 crc kubenswrapper[4735]: I1001 10:51:37.777738 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 10:51:37 crc kubenswrapper[4735]: I1001 10:51:37.783963 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf-kube-api-access-4dx2w" (OuterVolumeSpecName: "kube-api-access-4dx2w") pod "174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf" (UID: "174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf"). InnerVolumeSpecName "kube-api-access-4dx2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:51:37 crc kubenswrapper[4735]: I1001 10:51:37.830028 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf" (UID: "174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:51:37 crc kubenswrapper[4735]: I1001 10:51:37.879532 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 10:51:37 crc kubenswrapper[4735]: I1001 10:51:37.879576 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dx2w\" (UniqueName: \"kubernetes.io/projected/174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf-kube-api-access-4dx2w\") on node \"crc\" DevicePath \"\"" Oct 01 10:51:38 crc kubenswrapper[4735]: I1001 10:51:38.175194 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6mjh" event={"ID":"174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf","Type":"ContainerDied","Data":"033417a201fd65b77a1dad1124095017b7b34c2723f9f89727ae86b87f9cbfe1"} Oct 01 10:51:38 crc kubenswrapper[4735]: I1001 10:51:38.175284 4735 scope.go:117] "RemoveContainer" containerID="033417a201fd65b77a1dad1124095017b7b34c2723f9f89727ae86b87f9cbfe1" Oct 01 10:51:38 crc kubenswrapper[4735]: I1001 10:51:38.175296 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b6mjh" Oct 01 10:51:38 crc kubenswrapper[4735]: I1001 10:51:38.175204 4735 generic.go:334] "Generic (PLEG): container finished" podID="174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf" containerID="033417a201fd65b77a1dad1124095017b7b34c2723f9f89727ae86b87f9cbfe1" exitCode=0 Oct 01 10:51:38 crc kubenswrapper[4735]: I1001 10:51:38.175485 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6mjh" event={"ID":"174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf","Type":"ContainerDied","Data":"9e619f7d3dd58a5a462773c90ed5acee9d3dea034dd0b6119456e22b40ede7fc"} Oct 01 10:51:38 crc kubenswrapper[4735]: I1001 10:51:38.208047 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b6mjh"] Oct 01 10:51:38 crc kubenswrapper[4735]: I1001 10:51:38.209012 4735 scope.go:117] "RemoveContainer" containerID="c8e4bf76245bbe8119f17607723abc2ee668631aed82d1e37cd0ba7b480baf03" Oct 01 10:51:38 crc kubenswrapper[4735]: I1001 10:51:38.216061 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b6mjh"] Oct 01 10:51:38 crc kubenswrapper[4735]: I1001 10:51:38.229903 4735 scope.go:117] "RemoveContainer" containerID="e3924ce81c6baa9a870ebc62b87fd96caa5921a35647d691f78cce90618fb995" Oct 01 10:51:38 crc kubenswrapper[4735]: I1001 10:51:38.270413 4735 scope.go:117] "RemoveContainer" containerID="033417a201fd65b77a1dad1124095017b7b34c2723f9f89727ae86b87f9cbfe1" Oct 01 10:51:38 crc kubenswrapper[4735]: E1001 10:51:38.270893 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"033417a201fd65b77a1dad1124095017b7b34c2723f9f89727ae86b87f9cbfe1\": container with ID starting with 033417a201fd65b77a1dad1124095017b7b34c2723f9f89727ae86b87f9cbfe1 not found: ID does not exist" 
containerID="033417a201fd65b77a1dad1124095017b7b34c2723f9f89727ae86b87f9cbfe1" Oct 01 10:51:38 crc kubenswrapper[4735]: I1001 10:51:38.270924 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"033417a201fd65b77a1dad1124095017b7b34c2723f9f89727ae86b87f9cbfe1"} err="failed to get container status \"033417a201fd65b77a1dad1124095017b7b34c2723f9f89727ae86b87f9cbfe1\": rpc error: code = NotFound desc = could not find container \"033417a201fd65b77a1dad1124095017b7b34c2723f9f89727ae86b87f9cbfe1\": container with ID starting with 033417a201fd65b77a1dad1124095017b7b34c2723f9f89727ae86b87f9cbfe1 not found: ID does not exist" Oct 01 10:51:38 crc kubenswrapper[4735]: I1001 10:51:38.270943 4735 scope.go:117] "RemoveContainer" containerID="c8e4bf76245bbe8119f17607723abc2ee668631aed82d1e37cd0ba7b480baf03" Oct 01 10:51:38 crc kubenswrapper[4735]: E1001 10:51:38.271209 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8e4bf76245bbe8119f17607723abc2ee668631aed82d1e37cd0ba7b480baf03\": container with ID starting with c8e4bf76245bbe8119f17607723abc2ee668631aed82d1e37cd0ba7b480baf03 not found: ID does not exist" containerID="c8e4bf76245bbe8119f17607723abc2ee668631aed82d1e37cd0ba7b480baf03" Oct 01 10:51:38 crc kubenswrapper[4735]: I1001 10:51:38.271259 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8e4bf76245bbe8119f17607723abc2ee668631aed82d1e37cd0ba7b480baf03"} err="failed to get container status \"c8e4bf76245bbe8119f17607723abc2ee668631aed82d1e37cd0ba7b480baf03\": rpc error: code = NotFound desc = could not find container \"c8e4bf76245bbe8119f17607723abc2ee668631aed82d1e37cd0ba7b480baf03\": container with ID starting with c8e4bf76245bbe8119f17607723abc2ee668631aed82d1e37cd0ba7b480baf03 not found: ID does not exist" Oct 01 10:51:38 crc kubenswrapper[4735]: I1001 10:51:38.271292 4735 scope.go:117] 
"RemoveContainer" containerID="e3924ce81c6baa9a870ebc62b87fd96caa5921a35647d691f78cce90618fb995" Oct 01 10:51:38 crc kubenswrapper[4735]: E1001 10:51:38.271630 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3924ce81c6baa9a870ebc62b87fd96caa5921a35647d691f78cce90618fb995\": container with ID starting with e3924ce81c6baa9a870ebc62b87fd96caa5921a35647d691f78cce90618fb995 not found: ID does not exist" containerID="e3924ce81c6baa9a870ebc62b87fd96caa5921a35647d691f78cce90618fb995" Oct 01 10:51:38 crc kubenswrapper[4735]: I1001 10:51:38.271655 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3924ce81c6baa9a870ebc62b87fd96caa5921a35647d691f78cce90618fb995"} err="failed to get container status \"e3924ce81c6baa9a870ebc62b87fd96caa5921a35647d691f78cce90618fb995\": rpc error: code = NotFound desc = could not find container \"e3924ce81c6baa9a870ebc62b87fd96caa5921a35647d691f78cce90618fb995\": container with ID starting with e3924ce81c6baa9a870ebc62b87fd96caa5921a35647d691f78cce90618fb995 not found: ID does not exist" Oct 01 10:51:39 crc kubenswrapper[4735]: I1001 10:51:39.891408 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v5rcl" Oct 01 10:51:39 crc kubenswrapper[4735]: I1001 10:51:39.926553 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf" path="/var/lib/kubelet/pods/174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf/volumes" Oct 01 10:51:39 crc kubenswrapper[4735]: I1001 10:51:39.955444 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v5rcl" Oct 01 10:51:40 crc kubenswrapper[4735]: I1001 10:51:40.800859 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v5rcl"] Oct 01 10:51:41 crc kubenswrapper[4735]: I1001 
10:51:41.216424 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v5rcl" podUID="eb2b7c02-f428-4eb6-8775-8cef8429e66b" containerName="registry-server" containerID="cri-o://b2a9e5cbab9b491f9fa46c7f36ffa723c543571d6785129d92a4c28c32134767" gracePeriod=2 Oct 01 10:51:42 crc kubenswrapper[4735]: I1001 10:51:42.223477 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v5rcl" Oct 01 10:51:42 crc kubenswrapper[4735]: I1001 10:51:42.227738 4735 generic.go:334] "Generic (PLEG): container finished" podID="eb2b7c02-f428-4eb6-8775-8cef8429e66b" containerID="b2a9e5cbab9b491f9fa46c7f36ffa723c543571d6785129d92a4c28c32134767" exitCode=0 Oct 01 10:51:42 crc kubenswrapper[4735]: I1001 10:51:42.227778 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v5rcl" event={"ID":"eb2b7c02-f428-4eb6-8775-8cef8429e66b","Type":"ContainerDied","Data":"b2a9e5cbab9b491f9fa46c7f36ffa723c543571d6785129d92a4c28c32134767"} Oct 01 10:51:42 crc kubenswrapper[4735]: I1001 10:51:42.227808 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v5rcl" Oct 01 10:51:42 crc kubenswrapper[4735]: I1001 10:51:42.227823 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v5rcl" event={"ID":"eb2b7c02-f428-4eb6-8775-8cef8429e66b","Type":"ContainerDied","Data":"e919977f3d95d52f2f0d8f71bd12821ebdfd7cf99f4ca545af788318480cc89b"} Oct 01 10:51:42 crc kubenswrapper[4735]: I1001 10:51:42.227847 4735 scope.go:117] "RemoveContainer" containerID="b2a9e5cbab9b491f9fa46c7f36ffa723c543571d6785129d92a4c28c32134767" Oct 01 10:51:42 crc kubenswrapper[4735]: I1001 10:51:42.256285 4735 scope.go:117] "RemoveContainer" containerID="3b892661872338ecdb9faeb2fb2dda77ec9a1f8a0484eca41e9021c41e8156f1" Oct 01 10:51:42 crc kubenswrapper[4735]: I1001 10:51:42.273139 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb2b7c02-f428-4eb6-8775-8cef8429e66b-catalog-content\") pod \"eb2b7c02-f428-4eb6-8775-8cef8429e66b\" (UID: \"eb2b7c02-f428-4eb6-8775-8cef8429e66b\") " Oct 01 10:51:42 crc kubenswrapper[4735]: I1001 10:51:42.273285 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb2b7c02-f428-4eb6-8775-8cef8429e66b-utilities\") pod \"eb2b7c02-f428-4eb6-8775-8cef8429e66b\" (UID: \"eb2b7c02-f428-4eb6-8775-8cef8429e66b\") " Oct 01 10:51:42 crc kubenswrapper[4735]: I1001 10:51:42.273468 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrc5q\" (UniqueName: \"kubernetes.io/projected/eb2b7c02-f428-4eb6-8775-8cef8429e66b-kube-api-access-xrc5q\") pod \"eb2b7c02-f428-4eb6-8775-8cef8429e66b\" (UID: \"eb2b7c02-f428-4eb6-8775-8cef8429e66b\") " Oct 01 10:51:42 crc kubenswrapper[4735]: I1001 10:51:42.277380 4735 scope.go:117] "RemoveContainer" containerID="74bcde99a37257db75be90a4c3cd91b532e5807266a2602ccf9627829077c9ab" 
Oct 01 10:51:42 crc kubenswrapper[4735]: I1001 10:51:42.277784 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb2b7c02-f428-4eb6-8775-8cef8429e66b-utilities" (OuterVolumeSpecName: "utilities") pod "eb2b7c02-f428-4eb6-8775-8cef8429e66b" (UID: "eb2b7c02-f428-4eb6-8775-8cef8429e66b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:51:42 crc kubenswrapper[4735]: I1001 10:51:42.280539 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb2b7c02-f428-4eb6-8775-8cef8429e66b-kube-api-access-xrc5q" (OuterVolumeSpecName: "kube-api-access-xrc5q") pod "eb2b7c02-f428-4eb6-8775-8cef8429e66b" (UID: "eb2b7c02-f428-4eb6-8775-8cef8429e66b"). InnerVolumeSpecName "kube-api-access-xrc5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:51:42 crc kubenswrapper[4735]: I1001 10:51:42.356462 4735 scope.go:117] "RemoveContainer" containerID="b2a9e5cbab9b491f9fa46c7f36ffa723c543571d6785129d92a4c28c32134767" Oct 01 10:51:42 crc kubenswrapper[4735]: E1001 10:51:42.356985 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2a9e5cbab9b491f9fa46c7f36ffa723c543571d6785129d92a4c28c32134767\": container with ID starting with b2a9e5cbab9b491f9fa46c7f36ffa723c543571d6785129d92a4c28c32134767 not found: ID does not exist" containerID="b2a9e5cbab9b491f9fa46c7f36ffa723c543571d6785129d92a4c28c32134767" Oct 01 10:51:42 crc kubenswrapper[4735]: I1001 10:51:42.357041 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2a9e5cbab9b491f9fa46c7f36ffa723c543571d6785129d92a4c28c32134767"} err="failed to get container status \"b2a9e5cbab9b491f9fa46c7f36ffa723c543571d6785129d92a4c28c32134767\": rpc error: code = NotFound desc = could not find container \"b2a9e5cbab9b491f9fa46c7f36ffa723c543571d6785129d92a4c28c32134767\": 
container with ID starting with b2a9e5cbab9b491f9fa46c7f36ffa723c543571d6785129d92a4c28c32134767 not found: ID does not exist" Oct 01 10:51:42 crc kubenswrapper[4735]: I1001 10:51:42.357076 4735 scope.go:117] "RemoveContainer" containerID="3b892661872338ecdb9faeb2fb2dda77ec9a1f8a0484eca41e9021c41e8156f1" Oct 01 10:51:42 crc kubenswrapper[4735]: E1001 10:51:42.357515 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b892661872338ecdb9faeb2fb2dda77ec9a1f8a0484eca41e9021c41e8156f1\": container with ID starting with 3b892661872338ecdb9faeb2fb2dda77ec9a1f8a0484eca41e9021c41e8156f1 not found: ID does not exist" containerID="3b892661872338ecdb9faeb2fb2dda77ec9a1f8a0484eca41e9021c41e8156f1" Oct 01 10:51:42 crc kubenswrapper[4735]: I1001 10:51:42.357554 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b892661872338ecdb9faeb2fb2dda77ec9a1f8a0484eca41e9021c41e8156f1"} err="failed to get container status \"3b892661872338ecdb9faeb2fb2dda77ec9a1f8a0484eca41e9021c41e8156f1\": rpc error: code = NotFound desc = could not find container \"3b892661872338ecdb9faeb2fb2dda77ec9a1f8a0484eca41e9021c41e8156f1\": container with ID starting with 3b892661872338ecdb9faeb2fb2dda77ec9a1f8a0484eca41e9021c41e8156f1 not found: ID does not exist" Oct 01 10:51:42 crc kubenswrapper[4735]: I1001 10:51:42.357581 4735 scope.go:117] "RemoveContainer" containerID="74bcde99a37257db75be90a4c3cd91b532e5807266a2602ccf9627829077c9ab" Oct 01 10:51:42 crc kubenswrapper[4735]: E1001 10:51:42.357863 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74bcde99a37257db75be90a4c3cd91b532e5807266a2602ccf9627829077c9ab\": container with ID starting with 74bcde99a37257db75be90a4c3cd91b532e5807266a2602ccf9627829077c9ab not found: ID does not exist" 
containerID="74bcde99a37257db75be90a4c3cd91b532e5807266a2602ccf9627829077c9ab" Oct 01 10:51:42 crc kubenswrapper[4735]: I1001 10:51:42.357894 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74bcde99a37257db75be90a4c3cd91b532e5807266a2602ccf9627829077c9ab"} err="failed to get container status \"74bcde99a37257db75be90a4c3cd91b532e5807266a2602ccf9627829077c9ab\": rpc error: code = NotFound desc = could not find container \"74bcde99a37257db75be90a4c3cd91b532e5807266a2602ccf9627829077c9ab\": container with ID starting with 74bcde99a37257db75be90a4c3cd91b532e5807266a2602ccf9627829077c9ab not found: ID does not exist" Oct 01 10:51:42 crc kubenswrapper[4735]: I1001 10:51:42.385009 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb2b7c02-f428-4eb6-8775-8cef8429e66b-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 10:51:42 crc kubenswrapper[4735]: I1001 10:51:42.385067 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrc5q\" (UniqueName: \"kubernetes.io/projected/eb2b7c02-f428-4eb6-8775-8cef8429e66b-kube-api-access-xrc5q\") on node \"crc\" DevicePath \"\"" Oct 01 10:51:42 crc kubenswrapper[4735]: I1001 10:51:42.394860 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb2b7c02-f428-4eb6-8775-8cef8429e66b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb2b7c02-f428-4eb6-8775-8cef8429e66b" (UID: "eb2b7c02-f428-4eb6-8775-8cef8429e66b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 10:51:42 crc kubenswrapper[4735]: I1001 10:51:42.487121 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb2b7c02-f428-4eb6-8775-8cef8429e66b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 10:51:42 crc kubenswrapper[4735]: I1001 10:51:42.589220 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v5rcl"] Oct 01 10:51:42 crc kubenswrapper[4735]: I1001 10:51:42.603031 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v5rcl"] Oct 01 10:51:43 crc kubenswrapper[4735]: I1001 10:51:43.917232 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb2b7c02-f428-4eb6-8775-8cef8429e66b" path="/var/lib/kubelet/pods/eb2b7c02-f428-4eb6-8775-8cef8429e66b/volumes" Oct 01 10:52:05 crc kubenswrapper[4735]: I1001 10:52:05.486604 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 10:52:05 crc kubenswrapper[4735]: I1001 10:52:05.487361 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 10:52:05 crc kubenswrapper[4735]: I1001 10:52:05.487453 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" Oct 01 10:52:05 crc kubenswrapper[4735]: I1001 10:52:05.488605 4735 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"21d9d4873dd057c624c81ffd0bd8bca0905cb18a9521b7de29a283d745338bf7"} pod="openshift-machine-config-operator/machine-config-daemon-xgg24" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 10:52:05 crc kubenswrapper[4735]: I1001 10:52:05.488744 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" containerID="cri-o://21d9d4873dd057c624c81ffd0bd8bca0905cb18a9521b7de29a283d745338bf7" gracePeriod=600 Oct 01 10:52:06 crc kubenswrapper[4735]: I1001 10:52:06.476077 4735 generic.go:334] "Generic (PLEG): container finished" podID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerID="21d9d4873dd057c624c81ffd0bd8bca0905cb18a9521b7de29a283d745338bf7" exitCode=0 Oct 01 10:52:06 crc kubenswrapper[4735]: I1001 10:52:06.476142 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" event={"ID":"8c2fdbf0-2469-4ca0-8624-d63609123cd1","Type":"ContainerDied","Data":"21d9d4873dd057c624c81ffd0bd8bca0905cb18a9521b7de29a283d745338bf7"} Oct 01 10:52:06 crc kubenswrapper[4735]: I1001 10:52:06.476632 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" event={"ID":"8c2fdbf0-2469-4ca0-8624-d63609123cd1","Type":"ContainerStarted","Data":"1032fe4bfe7b37437f955afc568b4ff5ec85395d8a3b1654729d50088e29791f"} Oct 01 10:52:06 crc kubenswrapper[4735]: I1001 10:52:06.476649 4735 scope.go:117] "RemoveContainer" containerID="6bb0238a5016780708fe4e2202a4b58f82d3046437841d7c2fad3739fa55b3ba" Oct 01 10:54:05 crc kubenswrapper[4735]: I1001 10:54:05.485909 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 10:54:05 crc kubenswrapper[4735]: I1001 10:54:05.486740 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 10:54:35 crc kubenswrapper[4735]: I1001 10:54:35.485784 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 10:54:35 crc kubenswrapper[4735]: I1001 10:54:35.486340 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 10:54:42 crc kubenswrapper[4735]: I1001 10:54:42.071426 4735 generic.go:334] "Generic (PLEG): container finished" podID="98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7" containerID="2d220c46ae0c6e564dbc7ea2d409a43a5f8e7e09267225eca8a12088a350094f" exitCode=0 Oct 01 10:54:42 crc kubenswrapper[4735]: I1001 10:54:42.071622 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs" event={"ID":"98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7","Type":"ContainerDied","Data":"2d220c46ae0c6e564dbc7ea2d409a43a5f8e7e09267225eca8a12088a350094f"} Oct 01 10:54:43 crc kubenswrapper[4735]: I1001 10:54:43.604042 4735 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs" Oct 01 10:54:43 crc kubenswrapper[4735]: I1001 10:54:43.714875 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7-libvirt-secret-0\") pod \"98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7\" (UID: \"98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7\") " Oct 01 10:54:43 crc kubenswrapper[4735]: I1001 10:54:43.715019 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7-inventory\") pod \"98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7\" (UID: \"98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7\") " Oct 01 10:54:43 crc kubenswrapper[4735]: I1001 10:54:43.715049 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7-ssh-key\") pod \"98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7\" (UID: \"98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7\") " Oct 01 10:54:43 crc kubenswrapper[4735]: I1001 10:54:43.715073 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbf8b\" (UniqueName: \"kubernetes.io/projected/98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7-kube-api-access-xbf8b\") pod \"98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7\" (UID: \"98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7\") " Oct 01 10:54:43 crc kubenswrapper[4735]: I1001 10:54:43.715183 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7-libvirt-combined-ca-bundle\") pod \"98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7\" (UID: \"98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7\") " Oct 01 10:54:43 crc kubenswrapper[4735]: I1001 10:54:43.723048 4735 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7" (UID: "98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:54:43 crc kubenswrapper[4735]: I1001 10:54:43.723481 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7-kube-api-access-xbf8b" (OuterVolumeSpecName: "kube-api-access-xbf8b") pod "98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7" (UID: "98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7"). InnerVolumeSpecName "kube-api-access-xbf8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:54:43 crc kubenswrapper[4735]: I1001 10:54:43.743335 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7" (UID: "98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:54:43 crc kubenswrapper[4735]: I1001 10:54:43.743440 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7" (UID: "98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:54:43 crc kubenswrapper[4735]: I1001 10:54:43.744442 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7-inventory" (OuterVolumeSpecName: "inventory") pod "98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7" (UID: "98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:54:43 crc kubenswrapper[4735]: I1001 10:54:43.817691 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 10:54:43 crc kubenswrapper[4735]: I1001 10:54:43.817728 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbf8b\" (UniqueName: \"kubernetes.io/projected/98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7-kube-api-access-xbf8b\") on node \"crc\" DevicePath \"\"" Oct 01 10:54:43 crc kubenswrapper[4735]: I1001 10:54:43.817741 4735 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:54:43 crc kubenswrapper[4735]: I1001 10:54:43.817752 4735 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 01 10:54:43 crc kubenswrapper[4735]: I1001 10:54:43.817761 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.098625 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs" event={"ID":"98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7","Type":"ContainerDied","Data":"67d1d4edd39d5283567a3dec046c2812c3dbf26e0de407b8092a4889f106c051"} Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.098912 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67d1d4edd39d5283567a3dec046c2812c3dbf26e0de407b8092a4889f106c051" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.098712 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.180930 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8"] Oct 01 10:54:44 crc kubenswrapper[4735]: E1001 10:54:44.181393 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf" containerName="extract-content" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.181409 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf" containerName="extract-content" Oct 01 10:54:44 crc kubenswrapper[4735]: E1001 10:54:44.181425 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb2b7c02-f428-4eb6-8775-8cef8429e66b" containerName="extract-utilities" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.181432 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb2b7c02-f428-4eb6-8775-8cef8429e66b" containerName="extract-utilities" Oct 01 10:54:44 crc kubenswrapper[4735]: E1001 10:54:44.181452 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb2b7c02-f428-4eb6-8775-8cef8429e66b" containerName="extract-content" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.181459 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb2b7c02-f428-4eb6-8775-8cef8429e66b" 
containerName="extract-content" Oct 01 10:54:44 crc kubenswrapper[4735]: E1001 10:54:44.181535 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf" containerName="registry-server" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.181545 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf" containerName="registry-server" Oct 01 10:54:44 crc kubenswrapper[4735]: E1001 10:54:44.181568 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.181577 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 01 10:54:44 crc kubenswrapper[4735]: E1001 10:54:44.181596 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf" containerName="extract-utilities" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.181604 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf" containerName="extract-utilities" Oct 01 10:54:44 crc kubenswrapper[4735]: E1001 10:54:44.181612 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb2b7c02-f428-4eb6-8775-8cef8429e66b" containerName="registry-server" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.181619 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb2b7c02-f428-4eb6-8775-8cef8429e66b" containerName="registry-server" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.181818 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.181840 4735 
memory_manager.go:354] "RemoveStaleState removing state" podUID="eb2b7c02-f428-4eb6-8775-8cef8429e66b" containerName="registry-server" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.181857 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="174fbdaf-6e1c-4481-b5ab-a76bdf62b9bf" containerName="registry-server" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.182689 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.193353 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.194048 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ggpjk" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.194236 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.194324 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.194397 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.194685 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.194784 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.198140 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8"] Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 
10:54:44.328833 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m7tr8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.329504 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m7tr8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.329681 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m7tr8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.329772 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m7tr8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.329868 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" 
(UniqueName: \"kubernetes.io/configmap/83863343-c31f-484c-9e44-3e6ed41988d8-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m7tr8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.330101 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnvpr\" (UniqueName: \"kubernetes.io/projected/83863343-c31f-484c-9e44-3e6ed41988d8-kube-api-access-vnvpr\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m7tr8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.330163 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m7tr8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.330301 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m7tr8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.330378 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m7tr8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.431533 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnvpr\" (UniqueName: \"kubernetes.io/projected/83863343-c31f-484c-9e44-3e6ed41988d8-kube-api-access-vnvpr\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m7tr8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.431577 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m7tr8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.431611 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m7tr8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.431630 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m7tr8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.431652 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m7tr8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.431678 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m7tr8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.431746 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m7tr8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.431768 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m7tr8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.431787 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/83863343-c31f-484c-9e44-3e6ed41988d8-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m7tr8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.432778 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/83863343-c31f-484c-9e44-3e6ed41988d8-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m7tr8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.437423 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m7tr8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.437735 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m7tr8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.437891 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m7tr8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.439136 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m7tr8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.439281 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m7tr8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.439354 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m7tr8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.439432 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m7tr8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" Oct 01 10:54:44 crc kubenswrapper[4735]: I1001 10:54:44.453800 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnvpr\" (UniqueName: \"kubernetes.io/projected/83863343-c31f-484c-9e44-3e6ed41988d8-kube-api-access-vnvpr\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m7tr8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" Oct 01 10:54:44 crc kubenswrapper[4735]: 
I1001 10:54:44.509298 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" Oct 01 10:54:45 crc kubenswrapper[4735]: I1001 10:54:45.041394 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8"] Oct 01 10:54:45 crc kubenswrapper[4735]: I1001 10:54:45.058304 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 10:54:45 crc kubenswrapper[4735]: I1001 10:54:45.108623 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" event={"ID":"83863343-c31f-484c-9e44-3e6ed41988d8","Type":"ContainerStarted","Data":"2ca4a6b8730f405ef0e9a3c9622da07bca27711e41ad90cb1f7a419f63b61df6"} Oct 01 10:54:46 crc kubenswrapper[4735]: I1001 10:54:46.117689 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" event={"ID":"83863343-c31f-484c-9e44-3e6ed41988d8","Type":"ContainerStarted","Data":"750ba8fb2cb2c053956148d4219e2eab1b7a8dcd7c5fa043a0a490e603c2f834"} Oct 01 10:54:47 crc kubenswrapper[4735]: I1001 10:54:47.149512 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" podStartSLOduration=2.352932194 podStartE2EDuration="3.149473841s" podCreationTimestamp="2025-10-01 10:54:44 +0000 UTC" firstStartedPulling="2025-10-01 10:54:45.057988835 +0000 UTC m=+2243.750810097" lastFinishedPulling="2025-10-01 10:54:45.854530482 +0000 UTC m=+2244.547351744" observedRunningTime="2025-10-01 10:54:47.14079264 +0000 UTC m=+2245.833613912" watchObservedRunningTime="2025-10-01 10:54:47.149473841 +0000 UTC m=+2245.842295113" Oct 01 10:55:05 crc kubenswrapper[4735]: I1001 10:55:05.485374 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 10:55:05 crc kubenswrapper[4735]: I1001 10:55:05.485933 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 10:55:05 crc kubenswrapper[4735]: I1001 10:55:05.485973 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" Oct 01 10:55:05 crc kubenswrapper[4735]: I1001 10:55:05.486785 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1032fe4bfe7b37437f955afc568b4ff5ec85395d8a3b1654729d50088e29791f"} pod="openshift-machine-config-operator/machine-config-daemon-xgg24" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 10:55:05 crc kubenswrapper[4735]: I1001 10:55:05.486859 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" containerID="cri-o://1032fe4bfe7b37437f955afc568b4ff5ec85395d8a3b1654729d50088e29791f" gracePeriod=600 Oct 01 10:55:06 crc kubenswrapper[4735]: E1001 10:55:06.292683 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:55:06 crc kubenswrapper[4735]: I1001 10:55:06.317621 4735 generic.go:334] "Generic (PLEG): container finished" podID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerID="1032fe4bfe7b37437f955afc568b4ff5ec85395d8a3b1654729d50088e29791f" exitCode=0 Oct 01 10:55:06 crc kubenswrapper[4735]: I1001 10:55:06.317823 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" event={"ID":"8c2fdbf0-2469-4ca0-8624-d63609123cd1","Type":"ContainerDied","Data":"1032fe4bfe7b37437f955afc568b4ff5ec85395d8a3b1654729d50088e29791f"} Oct 01 10:55:06 crc kubenswrapper[4735]: I1001 10:55:06.317964 4735 scope.go:117] "RemoveContainer" containerID="21d9d4873dd057c624c81ffd0bd8bca0905cb18a9521b7de29a283d745338bf7" Oct 01 10:55:06 crc kubenswrapper[4735]: I1001 10:55:06.318884 4735 scope.go:117] "RemoveContainer" containerID="1032fe4bfe7b37437f955afc568b4ff5ec85395d8a3b1654729d50088e29791f" Oct 01 10:55:06 crc kubenswrapper[4735]: E1001 10:55:06.319306 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:55:17 crc kubenswrapper[4735]: I1001 10:55:17.897892 4735 scope.go:117] "RemoveContainer" containerID="1032fe4bfe7b37437f955afc568b4ff5ec85395d8a3b1654729d50088e29791f" Oct 01 10:55:17 crc kubenswrapper[4735]: E1001 10:55:17.898791 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:55:32 crc kubenswrapper[4735]: I1001 10:55:32.897567 4735 scope.go:117] "RemoveContainer" containerID="1032fe4bfe7b37437f955afc568b4ff5ec85395d8a3b1654729d50088e29791f" Oct 01 10:55:32 crc kubenswrapper[4735]: E1001 10:55:32.898319 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:55:46 crc kubenswrapper[4735]: I1001 10:55:46.897509 4735 scope.go:117] "RemoveContainer" containerID="1032fe4bfe7b37437f955afc568b4ff5ec85395d8a3b1654729d50088e29791f" Oct 01 10:55:46 crc kubenswrapper[4735]: E1001 10:55:46.898378 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:56:00 crc kubenswrapper[4735]: I1001 10:56:00.897350 4735 scope.go:117] "RemoveContainer" containerID="1032fe4bfe7b37437f955afc568b4ff5ec85395d8a3b1654729d50088e29791f" Oct 01 10:56:00 crc kubenswrapper[4735]: E1001 10:56:00.898449 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:56:12 crc kubenswrapper[4735]: I1001 10:56:12.897444 4735 scope.go:117] "RemoveContainer" containerID="1032fe4bfe7b37437f955afc568b4ff5ec85395d8a3b1654729d50088e29791f" Oct 01 10:56:12 crc kubenswrapper[4735]: E1001 10:56:12.898349 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:56:27 crc kubenswrapper[4735]: I1001 10:56:27.898279 4735 scope.go:117] "RemoveContainer" containerID="1032fe4bfe7b37437f955afc568b4ff5ec85395d8a3b1654729d50088e29791f" Oct 01 10:56:27 crc kubenswrapper[4735]: E1001 10:56:27.899361 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:56:42 crc kubenswrapper[4735]: I1001 10:56:42.897733 4735 scope.go:117] "RemoveContainer" containerID="1032fe4bfe7b37437f955afc568b4ff5ec85395d8a3b1654729d50088e29791f" Oct 01 10:56:42 crc kubenswrapper[4735]: E1001 10:56:42.898902 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:56:54 crc kubenswrapper[4735]: I1001 10:56:54.899071 4735 scope.go:117] "RemoveContainer" containerID="1032fe4bfe7b37437f955afc568b4ff5ec85395d8a3b1654729d50088e29791f" Oct 01 10:56:54 crc kubenswrapper[4735]: E1001 10:56:54.900455 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:57:06 crc kubenswrapper[4735]: I1001 10:57:06.897823 4735 scope.go:117] "RemoveContainer" containerID="1032fe4bfe7b37437f955afc568b4ff5ec85395d8a3b1654729d50088e29791f" Oct 01 10:57:06 crc kubenswrapper[4735]: E1001 10:57:06.899185 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:57:20 crc kubenswrapper[4735]: I1001 10:57:20.897027 4735 scope.go:117] "RemoveContainer" containerID="1032fe4bfe7b37437f955afc568b4ff5ec85395d8a3b1654729d50088e29791f" Oct 01 10:57:20 crc kubenswrapper[4735]: E1001 10:57:20.898125 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:57:34 crc kubenswrapper[4735]: I1001 10:57:34.897344 4735 scope.go:117] "RemoveContainer" containerID="1032fe4bfe7b37437f955afc568b4ff5ec85395d8a3b1654729d50088e29791f" Oct 01 10:57:34 crc kubenswrapper[4735]: E1001 10:57:34.900380 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:57:48 crc kubenswrapper[4735]: I1001 10:57:48.897413 4735 scope.go:117] "RemoveContainer" containerID="1032fe4bfe7b37437f955afc568b4ff5ec85395d8a3b1654729d50088e29791f" Oct 01 10:57:48 crc kubenswrapper[4735]: E1001 10:57:48.898362 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:58:02 crc kubenswrapper[4735]: I1001 10:58:02.897206 4735 scope.go:117] "RemoveContainer" containerID="1032fe4bfe7b37437f955afc568b4ff5ec85395d8a3b1654729d50088e29791f" Oct 01 10:58:02 crc kubenswrapper[4735]: E1001 10:58:02.898038 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:58:05 crc kubenswrapper[4735]: I1001 10:58:05.126407 4735 generic.go:334] "Generic (PLEG): container finished" podID="83863343-c31f-484c-9e44-3e6ed41988d8" containerID="750ba8fb2cb2c053956148d4219e2eab1b7a8dcd7c5fa043a0a490e603c2f834" exitCode=0 Oct 01 10:58:05 crc kubenswrapper[4735]: I1001 10:58:05.126554 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" event={"ID":"83863343-c31f-484c-9e44-3e6ed41988d8","Type":"ContainerDied","Data":"750ba8fb2cb2c053956148d4219e2eab1b7a8dcd7c5fa043a0a490e603c2f834"} Oct 01 10:58:06 crc kubenswrapper[4735]: I1001 10:58:06.526253 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" Oct 01 10:58:06 crc kubenswrapper[4735]: I1001 10:58:06.679778 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/83863343-c31f-484c-9e44-3e6ed41988d8-nova-extra-config-0\") pod \"83863343-c31f-484c-9e44-3e6ed41988d8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " Oct 01 10:58:06 crc kubenswrapper[4735]: I1001 10:58:06.679854 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-nova-cell1-compute-config-1\") pod \"83863343-c31f-484c-9e44-3e6ed41988d8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " Oct 01 10:58:06 crc kubenswrapper[4735]: I1001 10:58:06.679892 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-nova-migration-ssh-key-1\") pod \"83863343-c31f-484c-9e44-3e6ed41988d8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " Oct 01 10:58:06 crc kubenswrapper[4735]: I1001 10:58:06.679964 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-nova-cell1-compute-config-0\") pod \"83863343-c31f-484c-9e44-3e6ed41988d8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " Oct 01 10:58:06 crc kubenswrapper[4735]: I1001 10:58:06.680050 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-nova-migration-ssh-key-0\") pod \"83863343-c31f-484c-9e44-3e6ed41988d8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " Oct 01 10:58:06 crc kubenswrapper[4735]: I1001 
10:58:06.680095 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-ssh-key\") pod \"83863343-c31f-484c-9e44-3e6ed41988d8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " Oct 01 10:58:06 crc kubenswrapper[4735]: I1001 10:58:06.680125 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnvpr\" (UniqueName: \"kubernetes.io/projected/83863343-c31f-484c-9e44-3e6ed41988d8-kube-api-access-vnvpr\") pod \"83863343-c31f-484c-9e44-3e6ed41988d8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " Oct 01 10:58:06 crc kubenswrapper[4735]: I1001 10:58:06.680181 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-inventory\") pod \"83863343-c31f-484c-9e44-3e6ed41988d8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " Oct 01 10:58:06 crc kubenswrapper[4735]: I1001 10:58:06.680311 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-nova-combined-ca-bundle\") pod \"83863343-c31f-484c-9e44-3e6ed41988d8\" (UID: \"83863343-c31f-484c-9e44-3e6ed41988d8\") " Oct 01 10:58:06 crc kubenswrapper[4735]: I1001 10:58:06.695338 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83863343-c31f-484c-9e44-3e6ed41988d8-kube-api-access-vnvpr" (OuterVolumeSpecName: "kube-api-access-vnvpr") pod "83863343-c31f-484c-9e44-3e6ed41988d8" (UID: "83863343-c31f-484c-9e44-3e6ed41988d8"). InnerVolumeSpecName "kube-api-access-vnvpr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 10:58:06 crc kubenswrapper[4735]: I1001 10:58:06.695633 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "83863343-c31f-484c-9e44-3e6ed41988d8" (UID: "83863343-c31f-484c-9e44-3e6ed41988d8"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:58:06 crc kubenswrapper[4735]: I1001 10:58:06.709239 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "83863343-c31f-484c-9e44-3e6ed41988d8" (UID: "83863343-c31f-484c-9e44-3e6ed41988d8"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:58:06 crc kubenswrapper[4735]: I1001 10:58:06.709683 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "83863343-c31f-484c-9e44-3e6ed41988d8" (UID: "83863343-c31f-484c-9e44-3e6ed41988d8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:58:06 crc kubenswrapper[4735]: I1001 10:58:06.711634 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "83863343-c31f-484c-9e44-3e6ed41988d8" (UID: "83863343-c31f-484c-9e44-3e6ed41988d8"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:58:06 crc kubenswrapper[4735]: I1001 10:58:06.717821 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "83863343-c31f-484c-9e44-3e6ed41988d8" (UID: "83863343-c31f-484c-9e44-3e6ed41988d8"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:58:06 crc kubenswrapper[4735]: I1001 10:58:06.719809 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "83863343-c31f-484c-9e44-3e6ed41988d8" (UID: "83863343-c31f-484c-9e44-3e6ed41988d8"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:58:06 crc kubenswrapper[4735]: I1001 10:58:06.721729 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83863343-c31f-484c-9e44-3e6ed41988d8-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "83863343-c31f-484c-9e44-3e6ed41988d8" (UID: "83863343-c31f-484c-9e44-3e6ed41988d8"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 10:58:06 crc kubenswrapper[4735]: I1001 10:58:06.736212 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-inventory" (OuterVolumeSpecName: "inventory") pod "83863343-c31f-484c-9e44-3e6ed41988d8" (UID: "83863343-c31f-484c-9e44-3e6ed41988d8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 10:58:06 crc kubenswrapper[4735]: I1001 10:58:06.782175 4735 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 01 10:58:06 crc kubenswrapper[4735]: I1001 10:58:06.782209 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 10:58:06 crc kubenswrapper[4735]: I1001 10:58:06.782220 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnvpr\" (UniqueName: \"kubernetes.io/projected/83863343-c31f-484c-9e44-3e6ed41988d8-kube-api-access-vnvpr\") on node \"crc\" DevicePath \"\"" Oct 01 10:58:06 crc kubenswrapper[4735]: I1001 10:58:06.782228 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 10:58:06 crc kubenswrapper[4735]: I1001 10:58:06.782237 4735 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 10:58:06 crc kubenswrapper[4735]: I1001 10:58:06.782248 4735 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/83863343-c31f-484c-9e44-3e6ed41988d8-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 10:58:06 crc kubenswrapper[4735]: I1001 10:58:06.782257 4735 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 01 10:58:06 crc 
kubenswrapper[4735]: I1001 10:58:06.782266 4735 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 01 10:58:06 crc kubenswrapper[4735]: I1001 10:58:06.782274 4735 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/83863343-c31f-484c-9e44-3e6ed41988d8-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.150834 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" event={"ID":"83863343-c31f-484c-9e44-3e6ed41988d8","Type":"ContainerDied","Data":"2ca4a6b8730f405ef0e9a3c9622da07bca27711e41ad90cb1f7a419f63b61df6"} Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.151454 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ca4a6b8730f405ef0e9a3c9622da07bca27711e41ad90cb1f7a419f63b61df6" Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.150965 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m7tr8" Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.282056 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq"] Oct 01 10:58:07 crc kubenswrapper[4735]: E1001 10:58:07.283024 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83863343-c31f-484c-9e44-3e6ed41988d8" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.283154 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="83863343-c31f-484c-9e44-3e6ed41988d8" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.283645 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="83863343-c31f-484c-9e44-3e6ed41988d8" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.284616 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq" Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.287468 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.293517 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.293793 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.293983 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.294151 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ggpjk" Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.294561 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq"] Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.394744 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v67zd\" (UniqueName: \"kubernetes.io/projected/061a2955-62b7-47d5-b62c-abb147006933-kube-api-access-v67zd\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq\" (UID: \"061a2955-62b7-47d5-b62c-abb147006933\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq" Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.394814 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq\" (UID: 
\"061a2955-62b7-47d5-b62c-abb147006933\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq" Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.394867 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq\" (UID: \"061a2955-62b7-47d5-b62c-abb147006933\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq" Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.394912 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq\" (UID: \"061a2955-62b7-47d5-b62c-abb147006933\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq" Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.395118 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq\" (UID: \"061a2955-62b7-47d5-b62c-abb147006933\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq" Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.395207 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq\" (UID: \"061a2955-62b7-47d5-b62c-abb147006933\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq" Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.395307 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq\" (UID: \"061a2955-62b7-47d5-b62c-abb147006933\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq" Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.497172 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq\" (UID: \"061a2955-62b7-47d5-b62c-abb147006933\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq" Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.497237 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq\" (UID: \"061a2955-62b7-47d5-b62c-abb147006933\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq" Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.497289 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq\" (UID: \"061a2955-62b7-47d5-b62c-abb147006933\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq" Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.497312 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq\" (UID: \"061a2955-62b7-47d5-b62c-abb147006933\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq" Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.497376 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq\" (UID: \"061a2955-62b7-47d5-b62c-abb147006933\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq" Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.497402 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v67zd\" (UniqueName: \"kubernetes.io/projected/061a2955-62b7-47d5-b62c-abb147006933-kube-api-access-v67zd\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq\" (UID: \"061a2955-62b7-47d5-b62c-abb147006933\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq" Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.497423 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq\" (UID: \"061a2955-62b7-47d5-b62c-abb147006933\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq" Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.503987 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-ceilometer-compute-config-data-0\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq\" (UID: \"061a2955-62b7-47d5-b62c-abb147006933\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq" Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.505248 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq\" (UID: \"061a2955-62b7-47d5-b62c-abb147006933\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq" Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.505739 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq\" (UID: \"061a2955-62b7-47d5-b62c-abb147006933\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq" Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.505903 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq\" (UID: \"061a2955-62b7-47d5-b62c-abb147006933\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq" Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.506947 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq\" (UID: \"061a2955-62b7-47d5-b62c-abb147006933\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq" Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.510995 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq\" (UID: \"061a2955-62b7-47d5-b62c-abb147006933\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq" Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.515344 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v67zd\" (UniqueName: \"kubernetes.io/projected/061a2955-62b7-47d5-b62c-abb147006933-kube-api-access-v67zd\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq\" (UID: \"061a2955-62b7-47d5-b62c-abb147006933\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq" Oct 01 10:58:07 crc kubenswrapper[4735]: I1001 10:58:07.640749 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq" Oct 01 10:58:08 crc kubenswrapper[4735]: I1001 10:58:08.190432 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq"] Oct 01 10:58:09 crc kubenswrapper[4735]: I1001 10:58:09.176702 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq" event={"ID":"061a2955-62b7-47d5-b62c-abb147006933","Type":"ContainerStarted","Data":"8191c51c3634d32fa3c8611c05d92f607fbb1422f141304bca3aa8cbc5cb6b5b"} Oct 01 10:58:09 crc kubenswrapper[4735]: I1001 10:58:09.178347 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq" event={"ID":"061a2955-62b7-47d5-b62c-abb147006933","Type":"ContainerStarted","Data":"7c013667ff1dfed54ef52f88c3941b35aec277c13377e3464b128b0c8df3370a"} Oct 01 10:58:09 crc kubenswrapper[4735]: I1001 10:58:09.205078 4735 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq" podStartSLOduration=1.6334996849999999 podStartE2EDuration="2.205026456s" podCreationTimestamp="2025-10-01 10:58:07 +0000 UTC" firstStartedPulling="2025-10-01 10:58:08.19277383 +0000 UTC m=+2446.885595092" lastFinishedPulling="2025-10-01 10:58:08.764300591 +0000 UTC m=+2447.457121863" observedRunningTime="2025-10-01 10:58:09.193995972 +0000 UTC m=+2447.886817234" watchObservedRunningTime="2025-10-01 10:58:09.205026456 +0000 UTC m=+2447.897847738" Oct 01 10:58:17 crc kubenswrapper[4735]: I1001 10:58:17.896760 4735 scope.go:117] "RemoveContainer" containerID="1032fe4bfe7b37437f955afc568b4ff5ec85395d8a3b1654729d50088e29791f" Oct 01 10:58:17 crc kubenswrapper[4735]: E1001 10:58:17.897324 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:58:30 crc kubenswrapper[4735]: I1001 10:58:30.896835 4735 scope.go:117] "RemoveContainer" containerID="1032fe4bfe7b37437f955afc568b4ff5ec85395d8a3b1654729d50088e29791f" Oct 01 10:58:30 crc kubenswrapper[4735]: E1001 10:58:30.897585 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:58:42 crc kubenswrapper[4735]: I1001 10:58:42.897508 4735 
scope.go:117] "RemoveContainer" containerID="1032fe4bfe7b37437f955afc568b4ff5ec85395d8a3b1654729d50088e29791f" Oct 01 10:58:42 crc kubenswrapper[4735]: E1001 10:58:42.898458 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:58:57 crc kubenswrapper[4735]: I1001 10:58:57.897903 4735 scope.go:117] "RemoveContainer" containerID="1032fe4bfe7b37437f955afc568b4ff5ec85395d8a3b1654729d50088e29791f" Oct 01 10:58:57 crc kubenswrapper[4735]: E1001 10:58:57.898940 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:59:08 crc kubenswrapper[4735]: I1001 10:59:08.898191 4735 scope.go:117] "RemoveContainer" containerID="1032fe4bfe7b37437f955afc568b4ff5ec85395d8a3b1654729d50088e29791f" Oct 01 10:59:08 crc kubenswrapper[4735]: E1001 10:59:08.899425 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:59:23 crc kubenswrapper[4735]: I1001 
10:59:23.898019 4735 scope.go:117] "RemoveContainer" containerID="1032fe4bfe7b37437f955afc568b4ff5ec85395d8a3b1654729d50088e29791f" Oct 01 10:59:23 crc kubenswrapper[4735]: E1001 10:59:23.898713 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:59:34 crc kubenswrapper[4735]: I1001 10:59:34.896991 4735 scope.go:117] "RemoveContainer" containerID="1032fe4bfe7b37437f955afc568b4ff5ec85395d8a3b1654729d50088e29791f" Oct 01 10:59:34 crc kubenswrapper[4735]: E1001 10:59:34.897835 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:59:45 crc kubenswrapper[4735]: I1001 10:59:45.898324 4735 scope.go:117] "RemoveContainer" containerID="1032fe4bfe7b37437f955afc568b4ff5ec85395d8a3b1654729d50088e29791f" Oct 01 10:59:45 crc kubenswrapper[4735]: E1001 10:59:45.899215 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 10:59:56 crc 
kubenswrapper[4735]: I1001 10:59:56.897893 4735 scope.go:117] "RemoveContainer" containerID="1032fe4bfe7b37437f955afc568b4ff5ec85395d8a3b1654729d50088e29791f" Oct 01 10:59:56 crc kubenswrapper[4735]: E1001 10:59:56.898721 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:00:00 crc kubenswrapper[4735]: I1001 11:00:00.163958 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321940-vtvjk"] Oct 01 11:00:00 crc kubenswrapper[4735]: I1001 11:00:00.165959 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321940-vtvjk" Oct 01 11:00:00 crc kubenswrapper[4735]: I1001 11:00:00.169557 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 11:00:00 crc kubenswrapper[4735]: I1001 11:00:00.171428 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 11:00:00 crc kubenswrapper[4735]: I1001 11:00:00.197279 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321940-vtvjk"] Oct 01 11:00:00 crc kubenswrapper[4735]: I1001 11:00:00.283300 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/044894f8-79a3-4626-9bd0-06d1471c1fee-config-volume\") pod \"collect-profiles-29321940-vtvjk\" (UID: 
\"044894f8-79a3-4626-9bd0-06d1471c1fee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321940-vtvjk" Oct 01 11:00:00 crc kubenswrapper[4735]: I1001 11:00:00.283398 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/044894f8-79a3-4626-9bd0-06d1471c1fee-secret-volume\") pod \"collect-profiles-29321940-vtvjk\" (UID: \"044894f8-79a3-4626-9bd0-06d1471c1fee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321940-vtvjk" Oct 01 11:00:00 crc kubenswrapper[4735]: I1001 11:00:00.283455 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb9kh\" (UniqueName: \"kubernetes.io/projected/044894f8-79a3-4626-9bd0-06d1471c1fee-kube-api-access-nb9kh\") pod \"collect-profiles-29321940-vtvjk\" (UID: \"044894f8-79a3-4626-9bd0-06d1471c1fee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321940-vtvjk" Oct 01 11:00:00 crc kubenswrapper[4735]: I1001 11:00:00.385481 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/044894f8-79a3-4626-9bd0-06d1471c1fee-secret-volume\") pod \"collect-profiles-29321940-vtvjk\" (UID: \"044894f8-79a3-4626-9bd0-06d1471c1fee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321940-vtvjk" Oct 01 11:00:00 crc kubenswrapper[4735]: I1001 11:00:00.385583 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb9kh\" (UniqueName: \"kubernetes.io/projected/044894f8-79a3-4626-9bd0-06d1471c1fee-kube-api-access-nb9kh\") pod \"collect-profiles-29321940-vtvjk\" (UID: \"044894f8-79a3-4626-9bd0-06d1471c1fee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321940-vtvjk" Oct 01 11:00:00 crc kubenswrapper[4735]: I1001 11:00:00.385642 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/044894f8-79a3-4626-9bd0-06d1471c1fee-config-volume\") pod \"collect-profiles-29321940-vtvjk\" (UID: \"044894f8-79a3-4626-9bd0-06d1471c1fee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321940-vtvjk" Oct 01 11:00:00 crc kubenswrapper[4735]: I1001 11:00:00.386534 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/044894f8-79a3-4626-9bd0-06d1471c1fee-config-volume\") pod \"collect-profiles-29321940-vtvjk\" (UID: \"044894f8-79a3-4626-9bd0-06d1471c1fee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321940-vtvjk" Oct 01 11:00:00 crc kubenswrapper[4735]: I1001 11:00:00.395707 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/044894f8-79a3-4626-9bd0-06d1471c1fee-secret-volume\") pod \"collect-profiles-29321940-vtvjk\" (UID: \"044894f8-79a3-4626-9bd0-06d1471c1fee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321940-vtvjk" Oct 01 11:00:00 crc kubenswrapper[4735]: I1001 11:00:00.404589 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb9kh\" (UniqueName: \"kubernetes.io/projected/044894f8-79a3-4626-9bd0-06d1471c1fee-kube-api-access-nb9kh\") pod \"collect-profiles-29321940-vtvjk\" (UID: \"044894f8-79a3-4626-9bd0-06d1471c1fee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321940-vtvjk" Oct 01 11:00:00 crc kubenswrapper[4735]: I1001 11:00:00.496127 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321940-vtvjk" Oct 01 11:00:00 crc kubenswrapper[4735]: I1001 11:00:00.963837 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321940-vtvjk"] Oct 01 11:00:01 crc kubenswrapper[4735]: I1001 11:00:01.259276 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321940-vtvjk" event={"ID":"044894f8-79a3-4626-9bd0-06d1471c1fee","Type":"ContainerStarted","Data":"e5463acad2886c4c99324ad982b0d2bcebd8b3f308d2b3ae7343ae2df06ccd00"} Oct 01 11:00:01 crc kubenswrapper[4735]: I1001 11:00:01.259814 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321940-vtvjk" event={"ID":"044894f8-79a3-4626-9bd0-06d1471c1fee","Type":"ContainerStarted","Data":"ddd565c73090aa4cfec01837662c98fcedf2754c369b8984fa06a25469cc31d7"} Oct 01 11:00:01 crc kubenswrapper[4735]: I1001 11:00:01.297726 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29321940-vtvjk" podStartSLOduration=1.297642806 podStartE2EDuration="1.297642806s" podCreationTimestamp="2025-10-01 11:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:00:01.28121796 +0000 UTC m=+2559.974039262" watchObservedRunningTime="2025-10-01 11:00:01.297642806 +0000 UTC m=+2559.990464088" Oct 01 11:00:02 crc kubenswrapper[4735]: I1001 11:00:02.274721 4735 generic.go:334] "Generic (PLEG): container finished" podID="044894f8-79a3-4626-9bd0-06d1471c1fee" containerID="e5463acad2886c4c99324ad982b0d2bcebd8b3f308d2b3ae7343ae2df06ccd00" exitCode=0 Oct 01 11:00:02 crc kubenswrapper[4735]: I1001 11:00:02.275142 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29321940-vtvjk" event={"ID":"044894f8-79a3-4626-9bd0-06d1471c1fee","Type":"ContainerDied","Data":"e5463acad2886c4c99324ad982b0d2bcebd8b3f308d2b3ae7343ae2df06ccd00"} Oct 01 11:00:03 crc kubenswrapper[4735]: I1001 11:00:03.673634 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321940-vtvjk" Oct 01 11:00:03 crc kubenswrapper[4735]: I1001 11:00:03.852078 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/044894f8-79a3-4626-9bd0-06d1471c1fee-secret-volume\") pod \"044894f8-79a3-4626-9bd0-06d1471c1fee\" (UID: \"044894f8-79a3-4626-9bd0-06d1471c1fee\") " Oct 01 11:00:03 crc kubenswrapper[4735]: I1001 11:00:03.852165 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/044894f8-79a3-4626-9bd0-06d1471c1fee-config-volume\") pod \"044894f8-79a3-4626-9bd0-06d1471c1fee\" (UID: \"044894f8-79a3-4626-9bd0-06d1471c1fee\") " Oct 01 11:00:03 crc kubenswrapper[4735]: I1001 11:00:03.852651 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nb9kh\" (UniqueName: \"kubernetes.io/projected/044894f8-79a3-4626-9bd0-06d1471c1fee-kube-api-access-nb9kh\") pod \"044894f8-79a3-4626-9bd0-06d1471c1fee\" (UID: \"044894f8-79a3-4626-9bd0-06d1471c1fee\") " Oct 01 11:00:03 crc kubenswrapper[4735]: I1001 11:00:03.853481 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/044894f8-79a3-4626-9bd0-06d1471c1fee-config-volume" (OuterVolumeSpecName: "config-volume") pod "044894f8-79a3-4626-9bd0-06d1471c1fee" (UID: "044894f8-79a3-4626-9bd0-06d1471c1fee"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:00:03 crc kubenswrapper[4735]: I1001 11:00:03.862479 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/044894f8-79a3-4626-9bd0-06d1471c1fee-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "044894f8-79a3-4626-9bd0-06d1471c1fee" (UID: "044894f8-79a3-4626-9bd0-06d1471c1fee"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:00:03 crc kubenswrapper[4735]: I1001 11:00:03.862896 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/044894f8-79a3-4626-9bd0-06d1471c1fee-kube-api-access-nb9kh" (OuterVolumeSpecName: "kube-api-access-nb9kh") pod "044894f8-79a3-4626-9bd0-06d1471c1fee" (UID: "044894f8-79a3-4626-9bd0-06d1471c1fee"). InnerVolumeSpecName "kube-api-access-nb9kh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:00:03 crc kubenswrapper[4735]: I1001 11:00:03.955758 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nb9kh\" (UniqueName: \"kubernetes.io/projected/044894f8-79a3-4626-9bd0-06d1471c1fee-kube-api-access-nb9kh\") on node \"crc\" DevicePath \"\"" Oct 01 11:00:03 crc kubenswrapper[4735]: I1001 11:00:03.955822 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/044894f8-79a3-4626-9bd0-06d1471c1fee-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 11:00:03 crc kubenswrapper[4735]: I1001 11:00:03.955844 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/044894f8-79a3-4626-9bd0-06d1471c1fee-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 11:00:04 crc kubenswrapper[4735]: I1001 11:00:04.306160 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321940-vtvjk" 
event={"ID":"044894f8-79a3-4626-9bd0-06d1471c1fee","Type":"ContainerDied","Data":"ddd565c73090aa4cfec01837662c98fcedf2754c369b8984fa06a25469cc31d7"} Oct 01 11:00:04 crc kubenswrapper[4735]: I1001 11:00:04.306202 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321940-vtvjk" Oct 01 11:00:04 crc kubenswrapper[4735]: I1001 11:00:04.306222 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddd565c73090aa4cfec01837662c98fcedf2754c369b8984fa06a25469cc31d7" Oct 01 11:00:04 crc kubenswrapper[4735]: I1001 11:00:04.389066 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321895-x8dfs"] Oct 01 11:00:04 crc kubenswrapper[4735]: I1001 11:00:04.398118 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321895-x8dfs"] Oct 01 11:00:05 crc kubenswrapper[4735]: I1001 11:00:05.916427 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa5136cb-51cf-4bc5-8839-d3cbf64e4c39" path="/var/lib/kubelet/pods/fa5136cb-51cf-4bc5-8839-d3cbf64e4c39/volumes" Oct 01 11:00:09 crc kubenswrapper[4735]: I1001 11:00:09.896585 4735 scope.go:117] "RemoveContainer" containerID="1032fe4bfe7b37437f955afc568b4ff5ec85395d8a3b1654729d50088e29791f" Oct 01 11:00:10 crc kubenswrapper[4735]: I1001 11:00:10.366836 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" event={"ID":"8c2fdbf0-2469-4ca0-8624-d63609123cd1","Type":"ContainerStarted","Data":"ac9f83cabc5202081a1059b4f03ea2e0039b7add88f2cc679eb472a1c50e7db1"} Oct 01 11:00:27 crc kubenswrapper[4735]: I1001 11:00:27.219998 4735 scope.go:117] "RemoveContainer" containerID="87713f8cabf71727c4602948dc62159caee081a69a26b407259c0af4fadd7e48" Oct 01 11:00:30 crc kubenswrapper[4735]: I1001 11:00:30.582863 4735 
generic.go:334] "Generic (PLEG): container finished" podID="061a2955-62b7-47d5-b62c-abb147006933" containerID="8191c51c3634d32fa3c8611c05d92f607fbb1422f141304bca3aa8cbc5cb6b5b" exitCode=0 Oct 01 11:00:30 crc kubenswrapper[4735]: I1001 11:00:30.582936 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq" event={"ID":"061a2955-62b7-47d5-b62c-abb147006933","Type":"ContainerDied","Data":"8191c51c3634d32fa3c8611c05d92f607fbb1422f141304bca3aa8cbc5cb6b5b"} Oct 01 11:00:32 crc kubenswrapper[4735]: I1001 11:00:32.047442 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq" Oct 01 11:00:32 crc kubenswrapper[4735]: I1001 11:00:32.163408 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-ceilometer-compute-config-data-0\") pod \"061a2955-62b7-47d5-b62c-abb147006933\" (UID: \"061a2955-62b7-47d5-b62c-abb147006933\") " Oct 01 11:00:32 crc kubenswrapper[4735]: I1001 11:00:32.163475 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-telemetry-combined-ca-bundle\") pod \"061a2955-62b7-47d5-b62c-abb147006933\" (UID: \"061a2955-62b7-47d5-b62c-abb147006933\") " Oct 01 11:00:32 crc kubenswrapper[4735]: I1001 11:00:32.163679 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-ssh-key\") pod \"061a2955-62b7-47d5-b62c-abb147006933\" (UID: \"061a2955-62b7-47d5-b62c-abb147006933\") " Oct 01 11:00:32 crc kubenswrapper[4735]: I1001 11:00:32.163778 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-inventory\") pod \"061a2955-62b7-47d5-b62c-abb147006933\" (UID: \"061a2955-62b7-47d5-b62c-abb147006933\") " Oct 01 11:00:32 crc kubenswrapper[4735]: I1001 11:00:32.163806 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-ceilometer-compute-config-data-2\") pod \"061a2955-62b7-47d5-b62c-abb147006933\" (UID: \"061a2955-62b7-47d5-b62c-abb147006933\") " Oct 01 11:00:32 crc kubenswrapper[4735]: I1001 11:00:32.163847 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v67zd\" (UniqueName: \"kubernetes.io/projected/061a2955-62b7-47d5-b62c-abb147006933-kube-api-access-v67zd\") pod \"061a2955-62b7-47d5-b62c-abb147006933\" (UID: \"061a2955-62b7-47d5-b62c-abb147006933\") " Oct 01 11:00:32 crc kubenswrapper[4735]: I1001 11:00:32.163864 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-ceilometer-compute-config-data-1\") pod \"061a2955-62b7-47d5-b62c-abb147006933\" (UID: \"061a2955-62b7-47d5-b62c-abb147006933\") " Oct 01 11:00:32 crc kubenswrapper[4735]: I1001 11:00:32.171177 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "061a2955-62b7-47d5-b62c-abb147006933" (UID: "061a2955-62b7-47d5-b62c-abb147006933"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:00:32 crc kubenswrapper[4735]: I1001 11:00:32.186707 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/061a2955-62b7-47d5-b62c-abb147006933-kube-api-access-v67zd" (OuterVolumeSpecName: "kube-api-access-v67zd") pod "061a2955-62b7-47d5-b62c-abb147006933" (UID: "061a2955-62b7-47d5-b62c-abb147006933"). InnerVolumeSpecName "kube-api-access-v67zd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:00:32 crc kubenswrapper[4735]: I1001 11:00:32.201407 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "061a2955-62b7-47d5-b62c-abb147006933" (UID: "061a2955-62b7-47d5-b62c-abb147006933"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:00:32 crc kubenswrapper[4735]: I1001 11:00:32.209015 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "061a2955-62b7-47d5-b62c-abb147006933" (UID: "061a2955-62b7-47d5-b62c-abb147006933"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:00:32 crc kubenswrapper[4735]: I1001 11:00:32.210396 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "061a2955-62b7-47d5-b62c-abb147006933" (UID: "061a2955-62b7-47d5-b62c-abb147006933"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:00:32 crc kubenswrapper[4735]: I1001 11:00:32.220824 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "061a2955-62b7-47d5-b62c-abb147006933" (UID: "061a2955-62b7-47d5-b62c-abb147006933"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:00:32 crc kubenswrapper[4735]: I1001 11:00:32.222672 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-inventory" (OuterVolumeSpecName: "inventory") pod "061a2955-62b7-47d5-b62c-abb147006933" (UID: "061a2955-62b7-47d5-b62c-abb147006933"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:00:32 crc kubenswrapper[4735]: I1001 11:00:32.266879 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 11:00:32 crc kubenswrapper[4735]: I1001 11:00:32.266916 4735 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 11:00:32 crc kubenswrapper[4735]: I1001 11:00:32.266927 4735 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 01 11:00:32 crc kubenswrapper[4735]: I1001 11:00:32.266939 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v67zd\" (UniqueName: 
\"kubernetes.io/projected/061a2955-62b7-47d5-b62c-abb147006933-kube-api-access-v67zd\") on node \"crc\" DevicePath \"\"" Oct 01 11:00:32 crc kubenswrapper[4735]: I1001 11:00:32.266950 4735 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 01 11:00:32 crc kubenswrapper[4735]: I1001 11:00:32.266958 4735 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 01 11:00:32 crc kubenswrapper[4735]: I1001 11:00:32.266969 4735 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/061a2955-62b7-47d5-b62c-abb147006933-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:00:32 crc kubenswrapper[4735]: I1001 11:00:32.609212 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq" event={"ID":"061a2955-62b7-47d5-b62c-abb147006933","Type":"ContainerDied","Data":"7c013667ff1dfed54ef52f88c3941b35aec277c13377e3464b128b0c8df3370a"} Oct 01 11:00:32 crc kubenswrapper[4735]: I1001 11:00:32.609274 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c013667ff1dfed54ef52f88c3941b35aec277c13377e3464b128b0c8df3370a" Oct 01 11:00:32 crc kubenswrapper[4735]: I1001 11:00:32.609291 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq" Oct 01 11:01:00 crc kubenswrapper[4735]: I1001 11:01:00.153567 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29321941-p4wv8"] Oct 01 11:01:00 crc kubenswrapper[4735]: E1001 11:01:00.154862 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="044894f8-79a3-4626-9bd0-06d1471c1fee" containerName="collect-profiles" Oct 01 11:01:00 crc kubenswrapper[4735]: I1001 11:01:00.154890 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="044894f8-79a3-4626-9bd0-06d1471c1fee" containerName="collect-profiles" Oct 01 11:01:00 crc kubenswrapper[4735]: E1001 11:01:00.154915 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="061a2955-62b7-47d5-b62c-abb147006933" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 01 11:01:00 crc kubenswrapper[4735]: I1001 11:01:00.154931 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="061a2955-62b7-47d5-b62c-abb147006933" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 01 11:01:00 crc kubenswrapper[4735]: I1001 11:01:00.155390 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="061a2955-62b7-47d5-b62c-abb147006933" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 01 11:01:00 crc kubenswrapper[4735]: I1001 11:01:00.155418 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="044894f8-79a3-4626-9bd0-06d1471c1fee" containerName="collect-profiles" Oct 01 11:01:00 crc kubenswrapper[4735]: I1001 11:01:00.156536 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29321941-p4wv8" Oct 01 11:01:00 crc kubenswrapper[4735]: I1001 11:01:00.162088 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29321941-p4wv8"] Oct 01 11:01:00 crc kubenswrapper[4735]: I1001 11:01:00.210583 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/99c1d24d-af2e-4093-bd99-d1c1cdabd8be-fernet-keys\") pod \"keystone-cron-29321941-p4wv8\" (UID: \"99c1d24d-af2e-4093-bd99-d1c1cdabd8be\") " pod="openstack/keystone-cron-29321941-p4wv8" Oct 01 11:01:00 crc kubenswrapper[4735]: I1001 11:01:00.210652 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99c1d24d-af2e-4093-bd99-d1c1cdabd8be-config-data\") pod \"keystone-cron-29321941-p4wv8\" (UID: \"99c1d24d-af2e-4093-bd99-d1c1cdabd8be\") " pod="openstack/keystone-cron-29321941-p4wv8" Oct 01 11:01:00 crc kubenswrapper[4735]: I1001 11:01:00.210709 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r5x8\" (UniqueName: \"kubernetes.io/projected/99c1d24d-af2e-4093-bd99-d1c1cdabd8be-kube-api-access-5r5x8\") pod \"keystone-cron-29321941-p4wv8\" (UID: \"99c1d24d-af2e-4093-bd99-d1c1cdabd8be\") " pod="openstack/keystone-cron-29321941-p4wv8" Oct 01 11:01:00 crc kubenswrapper[4735]: I1001 11:01:00.210874 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99c1d24d-af2e-4093-bd99-d1c1cdabd8be-combined-ca-bundle\") pod \"keystone-cron-29321941-p4wv8\" (UID: \"99c1d24d-af2e-4093-bd99-d1c1cdabd8be\") " pod="openstack/keystone-cron-29321941-p4wv8" Oct 01 11:01:00 crc kubenswrapper[4735]: I1001 11:01:00.312961 4735 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/99c1d24d-af2e-4093-bd99-d1c1cdabd8be-fernet-keys\") pod \"keystone-cron-29321941-p4wv8\" (UID: \"99c1d24d-af2e-4093-bd99-d1c1cdabd8be\") " pod="openstack/keystone-cron-29321941-p4wv8" Oct 01 11:01:00 crc kubenswrapper[4735]: I1001 11:01:00.313036 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99c1d24d-af2e-4093-bd99-d1c1cdabd8be-config-data\") pod \"keystone-cron-29321941-p4wv8\" (UID: \"99c1d24d-af2e-4093-bd99-d1c1cdabd8be\") " pod="openstack/keystone-cron-29321941-p4wv8" Oct 01 11:01:00 crc kubenswrapper[4735]: I1001 11:01:00.313101 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r5x8\" (UniqueName: \"kubernetes.io/projected/99c1d24d-af2e-4093-bd99-d1c1cdabd8be-kube-api-access-5r5x8\") pod \"keystone-cron-29321941-p4wv8\" (UID: \"99c1d24d-af2e-4093-bd99-d1c1cdabd8be\") " pod="openstack/keystone-cron-29321941-p4wv8" Oct 01 11:01:00 crc kubenswrapper[4735]: I1001 11:01:00.313208 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99c1d24d-af2e-4093-bd99-d1c1cdabd8be-combined-ca-bundle\") pod \"keystone-cron-29321941-p4wv8\" (UID: \"99c1d24d-af2e-4093-bd99-d1c1cdabd8be\") " pod="openstack/keystone-cron-29321941-p4wv8" Oct 01 11:01:00 crc kubenswrapper[4735]: I1001 11:01:00.319164 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99c1d24d-af2e-4093-bd99-d1c1cdabd8be-config-data\") pod \"keystone-cron-29321941-p4wv8\" (UID: \"99c1d24d-af2e-4093-bd99-d1c1cdabd8be\") " pod="openstack/keystone-cron-29321941-p4wv8" Oct 01 11:01:00 crc kubenswrapper[4735]: I1001 11:01:00.320204 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/99c1d24d-af2e-4093-bd99-d1c1cdabd8be-combined-ca-bundle\") pod \"keystone-cron-29321941-p4wv8\" (UID: \"99c1d24d-af2e-4093-bd99-d1c1cdabd8be\") " pod="openstack/keystone-cron-29321941-p4wv8" Oct 01 11:01:00 crc kubenswrapper[4735]: I1001 11:01:00.323581 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/99c1d24d-af2e-4093-bd99-d1c1cdabd8be-fernet-keys\") pod \"keystone-cron-29321941-p4wv8\" (UID: \"99c1d24d-af2e-4093-bd99-d1c1cdabd8be\") " pod="openstack/keystone-cron-29321941-p4wv8" Oct 01 11:01:00 crc kubenswrapper[4735]: I1001 11:01:00.332099 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r5x8\" (UniqueName: \"kubernetes.io/projected/99c1d24d-af2e-4093-bd99-d1c1cdabd8be-kube-api-access-5r5x8\") pod \"keystone-cron-29321941-p4wv8\" (UID: \"99c1d24d-af2e-4093-bd99-d1c1cdabd8be\") " pod="openstack/keystone-cron-29321941-p4wv8" Oct 01 11:01:00 crc kubenswrapper[4735]: I1001 11:01:00.485462 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29321941-p4wv8" Oct 01 11:01:00 crc kubenswrapper[4735]: I1001 11:01:00.979352 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29321941-p4wv8"] Oct 01 11:01:01 crc kubenswrapper[4735]: I1001 11:01:01.958863 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29321941-p4wv8" event={"ID":"99c1d24d-af2e-4093-bd99-d1c1cdabd8be","Type":"ContainerStarted","Data":"2f0e8e6d65af2f8d66ad4f8a67e98d75fd0adc547c69c269a589886e01d37f87"} Oct 01 11:01:01 crc kubenswrapper[4735]: I1001 11:01:01.959119 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29321941-p4wv8" event={"ID":"99c1d24d-af2e-4093-bd99-d1c1cdabd8be","Type":"ContainerStarted","Data":"e2f5a60370b1d19945d761e28f0af68e8782a0367814b825f089b3f0774791df"} Oct 01 11:01:01 crc kubenswrapper[4735]: I1001 11:01:01.985643 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29321941-p4wv8" podStartSLOduration=1.985614301 podStartE2EDuration="1.985614301s" podCreationTimestamp="2025-10-01 11:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:01:01.973396287 +0000 UTC m=+2620.666217549" watchObservedRunningTime="2025-10-01 11:01:01.985614301 +0000 UTC m=+2620.678435563" Oct 01 11:01:03 crc kubenswrapper[4735]: I1001 11:01:03.986138 4735 generic.go:334] "Generic (PLEG): container finished" podID="99c1d24d-af2e-4093-bd99-d1c1cdabd8be" containerID="2f0e8e6d65af2f8d66ad4f8a67e98d75fd0adc547c69c269a589886e01d37f87" exitCode=0 Oct 01 11:01:03 crc kubenswrapper[4735]: I1001 11:01:03.986547 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29321941-p4wv8" 
event={"ID":"99c1d24d-af2e-4093-bd99-d1c1cdabd8be","Type":"ContainerDied","Data":"2f0e8e6d65af2f8d66ad4f8a67e98d75fd0adc547c69c269a589886e01d37f87"} Oct 01 11:01:03 crc kubenswrapper[4735]: I1001 11:01:03.994326 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7grh6"] Oct 01 11:01:04 crc kubenswrapper[4735]: I1001 11:01:04.031377 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7grh6"] Oct 01 11:01:04 crc kubenswrapper[4735]: I1001 11:01:04.032118 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7grh6" Oct 01 11:01:04 crc kubenswrapper[4735]: I1001 11:01:04.096332 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c-utilities\") pod \"community-operators-7grh6\" (UID: \"b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c\") " pod="openshift-marketplace/community-operators-7grh6" Oct 01 11:01:04 crc kubenswrapper[4735]: I1001 11:01:04.096480 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgl5p\" (UniqueName: \"kubernetes.io/projected/b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c-kube-api-access-zgl5p\") pod \"community-operators-7grh6\" (UID: \"b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c\") " pod="openshift-marketplace/community-operators-7grh6" Oct 01 11:01:04 crc kubenswrapper[4735]: I1001 11:01:04.096535 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c-catalog-content\") pod \"community-operators-7grh6\" (UID: \"b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c\") " pod="openshift-marketplace/community-operators-7grh6" Oct 01 11:01:04 crc kubenswrapper[4735]: I1001 11:01:04.198156 
4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgl5p\" (UniqueName: \"kubernetes.io/projected/b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c-kube-api-access-zgl5p\") pod \"community-operators-7grh6\" (UID: \"b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c\") " pod="openshift-marketplace/community-operators-7grh6" Oct 01 11:01:04 crc kubenswrapper[4735]: I1001 11:01:04.198218 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c-catalog-content\") pod \"community-operators-7grh6\" (UID: \"b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c\") " pod="openshift-marketplace/community-operators-7grh6" Oct 01 11:01:04 crc kubenswrapper[4735]: I1001 11:01:04.198362 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c-utilities\") pod \"community-operators-7grh6\" (UID: \"b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c\") " pod="openshift-marketplace/community-operators-7grh6" Oct 01 11:01:04 crc kubenswrapper[4735]: I1001 11:01:04.198866 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c-utilities\") pod \"community-operators-7grh6\" (UID: \"b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c\") " pod="openshift-marketplace/community-operators-7grh6" Oct 01 11:01:04 crc kubenswrapper[4735]: I1001 11:01:04.199049 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c-catalog-content\") pod \"community-operators-7grh6\" (UID: \"b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c\") " pod="openshift-marketplace/community-operators-7grh6" Oct 01 11:01:04 crc kubenswrapper[4735]: I1001 11:01:04.219290 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zgl5p\" (UniqueName: \"kubernetes.io/projected/b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c-kube-api-access-zgl5p\") pod \"community-operators-7grh6\" (UID: \"b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c\") " pod="openshift-marketplace/community-operators-7grh6" Oct 01 11:01:04 crc kubenswrapper[4735]: I1001 11:01:04.362083 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7grh6" Oct 01 11:01:04 crc kubenswrapper[4735]: I1001 11:01:04.880237 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7grh6"] Oct 01 11:01:04 crc kubenswrapper[4735]: W1001 11:01:04.898574 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9bf3cf7_d817_4140_a6f8_eb7bc7769f8c.slice/crio-3f59dc0b7a5685fbd571f572426bf26b52b5535960ac0138df34f7b31bff6783 WatchSource:0}: Error finding container 3f59dc0b7a5685fbd571f572426bf26b52b5535960ac0138df34f7b31bff6783: Status 404 returned error can't find the container with id 3f59dc0b7a5685fbd571f572426bf26b52b5535960ac0138df34f7b31bff6783 Oct 01 11:01:04 crc kubenswrapper[4735]: I1001 11:01:04.998987 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7grh6" event={"ID":"b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c","Type":"ContainerStarted","Data":"3f59dc0b7a5685fbd571f572426bf26b52b5535960ac0138df34f7b31bff6783"} Oct 01 11:01:05 crc kubenswrapper[4735]: I1001 11:01:05.355564 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29321941-p4wv8" Oct 01 11:01:05 crc kubenswrapper[4735]: I1001 11:01:05.422948 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99c1d24d-af2e-4093-bd99-d1c1cdabd8be-config-data\") pod \"99c1d24d-af2e-4093-bd99-d1c1cdabd8be\" (UID: \"99c1d24d-af2e-4093-bd99-d1c1cdabd8be\") " Oct 01 11:01:05 crc kubenswrapper[4735]: I1001 11:01:05.423126 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/99c1d24d-af2e-4093-bd99-d1c1cdabd8be-fernet-keys\") pod \"99c1d24d-af2e-4093-bd99-d1c1cdabd8be\" (UID: \"99c1d24d-af2e-4093-bd99-d1c1cdabd8be\") " Oct 01 11:01:05 crc kubenswrapper[4735]: I1001 11:01:05.423161 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99c1d24d-af2e-4093-bd99-d1c1cdabd8be-combined-ca-bundle\") pod \"99c1d24d-af2e-4093-bd99-d1c1cdabd8be\" (UID: \"99c1d24d-af2e-4093-bd99-d1c1cdabd8be\") " Oct 01 11:01:05 crc kubenswrapper[4735]: I1001 11:01:05.423192 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5r5x8\" (UniqueName: \"kubernetes.io/projected/99c1d24d-af2e-4093-bd99-d1c1cdabd8be-kube-api-access-5r5x8\") pod \"99c1d24d-af2e-4093-bd99-d1c1cdabd8be\" (UID: \"99c1d24d-af2e-4093-bd99-d1c1cdabd8be\") " Oct 01 11:01:05 crc kubenswrapper[4735]: I1001 11:01:05.430768 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99c1d24d-af2e-4093-bd99-d1c1cdabd8be-kube-api-access-5r5x8" (OuterVolumeSpecName: "kube-api-access-5r5x8") pod "99c1d24d-af2e-4093-bd99-d1c1cdabd8be" (UID: "99c1d24d-af2e-4093-bd99-d1c1cdabd8be"). InnerVolumeSpecName "kube-api-access-5r5x8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:01:05 crc kubenswrapper[4735]: I1001 11:01:05.430888 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99c1d24d-af2e-4093-bd99-d1c1cdabd8be-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "99c1d24d-af2e-4093-bd99-d1c1cdabd8be" (UID: "99c1d24d-af2e-4093-bd99-d1c1cdabd8be"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:01:05 crc kubenswrapper[4735]: I1001 11:01:05.457152 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99c1d24d-af2e-4093-bd99-d1c1cdabd8be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99c1d24d-af2e-4093-bd99-d1c1cdabd8be" (UID: "99c1d24d-af2e-4093-bd99-d1c1cdabd8be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:01:05 crc kubenswrapper[4735]: I1001 11:01:05.495531 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99c1d24d-af2e-4093-bd99-d1c1cdabd8be-config-data" (OuterVolumeSpecName: "config-data") pod "99c1d24d-af2e-4093-bd99-d1c1cdabd8be" (UID: "99c1d24d-af2e-4093-bd99-d1c1cdabd8be"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:01:05 crc kubenswrapper[4735]: I1001 11:01:05.525388 4735 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/99c1d24d-af2e-4093-bd99-d1c1cdabd8be-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 11:01:05 crc kubenswrapper[4735]: I1001 11:01:05.525443 4735 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99c1d24d-af2e-4093-bd99-d1c1cdabd8be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 11:01:05 crc kubenswrapper[4735]: I1001 11:01:05.525469 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5r5x8\" (UniqueName: \"kubernetes.io/projected/99c1d24d-af2e-4093-bd99-d1c1cdabd8be-kube-api-access-5r5x8\") on node \"crc\" DevicePath \"\"" Oct 01 11:01:05 crc kubenswrapper[4735]: I1001 11:01:05.525492 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99c1d24d-af2e-4093-bd99-d1c1cdabd8be-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:01:06 crc kubenswrapper[4735]: I1001 11:01:06.011759 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29321941-p4wv8" event={"ID":"99c1d24d-af2e-4093-bd99-d1c1cdabd8be","Type":"ContainerDied","Data":"e2f5a60370b1d19945d761e28f0af68e8782a0367814b825f089b3f0774791df"} Oct 01 11:01:06 crc kubenswrapper[4735]: I1001 11:01:06.011829 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2f5a60370b1d19945d761e28f0af68e8782a0367814b825f089b3f0774791df" Oct 01 11:01:06 crc kubenswrapper[4735]: I1001 11:01:06.011836 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29321941-p4wv8" Oct 01 11:01:06 crc kubenswrapper[4735]: I1001 11:01:06.014324 4735 generic.go:334] "Generic (PLEG): container finished" podID="b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c" containerID="8b0cf30336c31980d621f48fed3a862c70dc622902256109c55103cfc575966f" exitCode=0 Oct 01 11:01:06 crc kubenswrapper[4735]: I1001 11:01:06.014366 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7grh6" event={"ID":"b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c","Type":"ContainerDied","Data":"8b0cf30336c31980d621f48fed3a862c70dc622902256109c55103cfc575966f"} Oct 01 11:01:06 crc kubenswrapper[4735]: I1001 11:01:06.017132 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 11:01:08 crc kubenswrapper[4735]: I1001 11:01:08.041670 4735 generic.go:334] "Generic (PLEG): container finished" podID="b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c" containerID="1638d5250e66c9b4a91223e09da6dce06d8f6f2216fbf461b7485b8a4bce2840" exitCode=0 Oct 01 11:01:08 crc kubenswrapper[4735]: I1001 11:01:08.042208 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7grh6" event={"ID":"b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c","Type":"ContainerDied","Data":"1638d5250e66c9b4a91223e09da6dce06d8f6f2216fbf461b7485b8a4bce2840"} Oct 01 11:01:09 crc kubenswrapper[4735]: I1001 11:01:09.057807 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7grh6" event={"ID":"b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c","Type":"ContainerStarted","Data":"50ddb8adad66048dd0ac654f57a38a090e88fcf97e0176940453e526fb466a58"} Oct 01 11:01:09 crc kubenswrapper[4735]: I1001 11:01:09.094082 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7grh6" podStartSLOduration=3.538553715 podStartE2EDuration="6.094043239s" 
podCreationTimestamp="2025-10-01 11:01:03 +0000 UTC" firstStartedPulling="2025-10-01 11:01:06.016610206 +0000 UTC m=+2624.709431468" lastFinishedPulling="2025-10-01 11:01:08.57209969 +0000 UTC m=+2627.264920992" observedRunningTime="2025-10-01 11:01:09.081136997 +0000 UTC m=+2627.773958309" watchObservedRunningTime="2025-10-01 11:01:09.094043239 +0000 UTC m=+2627.786864511" Oct 01 11:01:14 crc kubenswrapper[4735]: I1001 11:01:14.363167 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7grh6" Oct 01 11:01:14 crc kubenswrapper[4735]: I1001 11:01:14.363732 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7grh6" Oct 01 11:01:14 crc kubenswrapper[4735]: I1001 11:01:14.441468 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7grh6" Oct 01 11:01:15 crc kubenswrapper[4735]: I1001 11:01:15.212290 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7grh6" Oct 01 11:01:15 crc kubenswrapper[4735]: I1001 11:01:15.272002 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7grh6"] Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.144675 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7grh6" podUID="b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c" containerName="registry-server" containerID="cri-o://50ddb8adad66048dd0ac654f57a38a090e88fcf97e0176940453e526fb466a58" gracePeriod=2 Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.726186 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7grh6" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.738947 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Oct 01 11:01:17 crc kubenswrapper[4735]: E1001 11:01:17.739565 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c" containerName="extract-content" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.739618 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c" containerName="extract-content" Oct 01 11:01:17 crc kubenswrapper[4735]: E1001 11:01:17.739640 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c" containerName="extract-utilities" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.739654 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c" containerName="extract-utilities" Oct 01 11:01:17 crc kubenswrapper[4735]: E1001 11:01:17.739689 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c" containerName="registry-server" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.739701 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c" containerName="registry-server" Oct 01 11:01:17 crc kubenswrapper[4735]: E1001 11:01:17.739719 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99c1d24d-af2e-4093-bd99-d1c1cdabd8be" containerName="keystone-cron" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.739730 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c1d24d-af2e-4093-bd99-d1c1cdabd8be" containerName="keystone-cron" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.740196 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="99c1d24d-af2e-4093-bd99-d1c1cdabd8be" 
containerName="keystone-cron" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.740240 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c" containerName="registry-server" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.741275 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.744691 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.745276 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-dlsqd" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.745381 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.745713 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.754111 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.887361 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgl5p\" (UniqueName: \"kubernetes.io/projected/b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c-kube-api-access-zgl5p\") pod \"b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c\" (UID: \"b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c\") " Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.887609 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c-catalog-content\") pod \"b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c\" (UID: \"b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c\") 
" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.887762 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c-utilities\") pod \"b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c\" (UID: \"b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c\") " Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.888220 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/66ee97cc-ac56-4879-b605-e2a9347213ca-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " pod="openstack/tempest-tests-tempest" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.888331 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " pod="openstack/tempest-tests-tempest" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.888375 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/66ee97cc-ac56-4879-b605-e2a9347213ca-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " pod="openstack/tempest-tests-tempest" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.888406 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/66ee97cc-ac56-4879-b605-e2a9347213ca-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " pod="openstack/tempest-tests-tempest" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.888465 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66ee97cc-ac56-4879-b605-e2a9347213ca-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " pod="openstack/tempest-tests-tempest" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.888581 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kmxl\" (UniqueName: \"kubernetes.io/projected/66ee97cc-ac56-4879-b605-e2a9347213ca-kube-api-access-5kmxl\") pod \"tempest-tests-tempest\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " pod="openstack/tempest-tests-tempest" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.888745 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/66ee97cc-ac56-4879-b605-e2a9347213ca-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " pod="openstack/tempest-tests-tempest" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.888856 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66ee97cc-ac56-4879-b605-e2a9347213ca-config-data\") pod \"tempest-tests-tempest\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " pod="openstack/tempest-tests-tempest" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.888965 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66ee97cc-ac56-4879-b605-e2a9347213ca-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " pod="openstack/tempest-tests-tempest" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 
11:01:17.889603 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c-utilities" (OuterVolumeSpecName: "utilities") pod "b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c" (UID: "b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.897472 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c-kube-api-access-zgl5p" (OuterVolumeSpecName: "kube-api-access-zgl5p") pod "b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c" (UID: "b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c"). InnerVolumeSpecName "kube-api-access-zgl5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.971961 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c" (UID: "b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.990815 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66ee97cc-ac56-4879-b605-e2a9347213ca-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " pod="openstack/tempest-tests-tempest" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.991708 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/66ee97cc-ac56-4879-b605-e2a9347213ca-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " pod="openstack/tempest-tests-tempest" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.992102 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " pod="openstack/tempest-tests-tempest" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.992151 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/66ee97cc-ac56-4879-b605-e2a9347213ca-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " pod="openstack/tempest-tests-tempest" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.992179 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/66ee97cc-ac56-4879-b605-e2a9347213ca-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " pod="openstack/tempest-tests-tempest" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.992222 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66ee97cc-ac56-4879-b605-e2a9347213ca-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " pod="openstack/tempest-tests-tempest" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.992257 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kmxl\" (UniqueName: \"kubernetes.io/projected/66ee97cc-ac56-4879-b605-e2a9347213ca-kube-api-access-5kmxl\") pod \"tempest-tests-tempest\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " pod="openstack/tempest-tests-tempest" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.992390 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/66ee97cc-ac56-4879-b605-e2a9347213ca-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " pod="openstack/tempest-tests-tempest" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.992590 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/66ee97cc-ac56-4879-b605-e2a9347213ca-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " pod="openstack/tempest-tests-tempest" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.993130 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/66ee97cc-ac56-4879-b605-e2a9347213ca-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " pod="openstack/tempest-tests-tempest" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.993414 4735 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/tempest-tests-tempest" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.992492 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66ee97cc-ac56-4879-b605-e2a9347213ca-config-data\") pod \"tempest-tests-tempest\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " pod="openstack/tempest-tests-tempest" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.993806 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66ee97cc-ac56-4879-b605-e2a9347213ca-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " pod="openstack/tempest-tests-tempest" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.994107 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66ee97cc-ac56-4879-b605-e2a9347213ca-config-data\") pod \"tempest-tests-tempest\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " pod="openstack/tempest-tests-tempest" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.994277 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgl5p\" (UniqueName: \"kubernetes.io/projected/b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c-kube-api-access-zgl5p\") on node \"crc\" DevicePath \"\"" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.994313 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 11:01:17 crc 
kubenswrapper[4735]: I1001 11:01:17.994328 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 11:01:17 crc kubenswrapper[4735]: I1001 11:01:17.996771 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66ee97cc-ac56-4879-b605-e2a9347213ca-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " pod="openstack/tempest-tests-tempest" Oct 01 11:01:18 crc kubenswrapper[4735]: I1001 11:01:18.003246 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/66ee97cc-ac56-4879-b605-e2a9347213ca-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " pod="openstack/tempest-tests-tempest" Oct 01 11:01:18 crc kubenswrapper[4735]: I1001 11:01:18.020723 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kmxl\" (UniqueName: \"kubernetes.io/projected/66ee97cc-ac56-4879-b605-e2a9347213ca-kube-api-access-5kmxl\") pod \"tempest-tests-tempest\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " pod="openstack/tempest-tests-tempest" Oct 01 11:01:18 crc kubenswrapper[4735]: I1001 11:01:18.032267 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/66ee97cc-ac56-4879-b605-e2a9347213ca-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " pod="openstack/tempest-tests-tempest" Oct 01 11:01:18 crc kubenswrapper[4735]: I1001 11:01:18.059092 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: 
\"66ee97cc-ac56-4879-b605-e2a9347213ca\") " pod="openstack/tempest-tests-tempest" Oct 01 11:01:18 crc kubenswrapper[4735]: I1001 11:01:18.074718 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 01 11:01:18 crc kubenswrapper[4735]: I1001 11:01:18.158745 4735 generic.go:334] "Generic (PLEG): container finished" podID="b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c" containerID="50ddb8adad66048dd0ac654f57a38a090e88fcf97e0176940453e526fb466a58" exitCode=0 Oct 01 11:01:18 crc kubenswrapper[4735]: I1001 11:01:18.159079 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7grh6" event={"ID":"b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c","Type":"ContainerDied","Data":"50ddb8adad66048dd0ac654f57a38a090e88fcf97e0176940453e526fb466a58"} Oct 01 11:01:18 crc kubenswrapper[4735]: I1001 11:01:18.159108 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7grh6" event={"ID":"b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c","Type":"ContainerDied","Data":"3f59dc0b7a5685fbd571f572426bf26b52b5535960ac0138df34f7b31bff6783"} Oct 01 11:01:18 crc kubenswrapper[4735]: I1001 11:01:18.159127 4735 scope.go:117] "RemoveContainer" containerID="50ddb8adad66048dd0ac654f57a38a090e88fcf97e0176940453e526fb466a58" Oct 01 11:01:18 crc kubenswrapper[4735]: I1001 11:01:18.159264 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7grh6" Oct 01 11:01:18 crc kubenswrapper[4735]: I1001 11:01:18.228694 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7grh6"] Oct 01 11:01:18 crc kubenswrapper[4735]: I1001 11:01:18.228746 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7grh6"] Oct 01 11:01:18 crc kubenswrapper[4735]: I1001 11:01:18.235940 4735 scope.go:117] "RemoveContainer" containerID="1638d5250e66c9b4a91223e09da6dce06d8f6f2216fbf461b7485b8a4bce2840" Oct 01 11:01:18 crc kubenswrapper[4735]: I1001 11:01:18.268551 4735 scope.go:117] "RemoveContainer" containerID="8b0cf30336c31980d621f48fed3a862c70dc622902256109c55103cfc575966f" Oct 01 11:01:18 crc kubenswrapper[4735]: I1001 11:01:18.346119 4735 scope.go:117] "RemoveContainer" containerID="50ddb8adad66048dd0ac654f57a38a090e88fcf97e0176940453e526fb466a58" Oct 01 11:01:18 crc kubenswrapper[4735]: E1001 11:01:18.347039 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50ddb8adad66048dd0ac654f57a38a090e88fcf97e0176940453e526fb466a58\": container with ID starting with 50ddb8adad66048dd0ac654f57a38a090e88fcf97e0176940453e526fb466a58 not found: ID does not exist" containerID="50ddb8adad66048dd0ac654f57a38a090e88fcf97e0176940453e526fb466a58" Oct 01 11:01:18 crc kubenswrapper[4735]: I1001 11:01:18.347073 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50ddb8adad66048dd0ac654f57a38a090e88fcf97e0176940453e526fb466a58"} err="failed to get container status \"50ddb8adad66048dd0ac654f57a38a090e88fcf97e0176940453e526fb466a58\": rpc error: code = NotFound desc = could not find container \"50ddb8adad66048dd0ac654f57a38a090e88fcf97e0176940453e526fb466a58\": container with ID starting with 50ddb8adad66048dd0ac654f57a38a090e88fcf97e0176940453e526fb466a58 not 
found: ID does not exist" Oct 01 11:01:18 crc kubenswrapper[4735]: I1001 11:01:18.347093 4735 scope.go:117] "RemoveContainer" containerID="1638d5250e66c9b4a91223e09da6dce06d8f6f2216fbf461b7485b8a4bce2840" Oct 01 11:01:18 crc kubenswrapper[4735]: E1001 11:01:18.347823 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1638d5250e66c9b4a91223e09da6dce06d8f6f2216fbf461b7485b8a4bce2840\": container with ID starting with 1638d5250e66c9b4a91223e09da6dce06d8f6f2216fbf461b7485b8a4bce2840 not found: ID does not exist" containerID="1638d5250e66c9b4a91223e09da6dce06d8f6f2216fbf461b7485b8a4bce2840" Oct 01 11:01:18 crc kubenswrapper[4735]: I1001 11:01:18.347847 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1638d5250e66c9b4a91223e09da6dce06d8f6f2216fbf461b7485b8a4bce2840"} err="failed to get container status \"1638d5250e66c9b4a91223e09da6dce06d8f6f2216fbf461b7485b8a4bce2840\": rpc error: code = NotFound desc = could not find container \"1638d5250e66c9b4a91223e09da6dce06d8f6f2216fbf461b7485b8a4bce2840\": container with ID starting with 1638d5250e66c9b4a91223e09da6dce06d8f6f2216fbf461b7485b8a4bce2840 not found: ID does not exist" Oct 01 11:01:18 crc kubenswrapper[4735]: I1001 11:01:18.347862 4735 scope.go:117] "RemoveContainer" containerID="8b0cf30336c31980d621f48fed3a862c70dc622902256109c55103cfc575966f" Oct 01 11:01:18 crc kubenswrapper[4735]: E1001 11:01:18.348161 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b0cf30336c31980d621f48fed3a862c70dc622902256109c55103cfc575966f\": container with ID starting with 8b0cf30336c31980d621f48fed3a862c70dc622902256109c55103cfc575966f not found: ID does not exist" containerID="8b0cf30336c31980d621f48fed3a862c70dc622902256109c55103cfc575966f" Oct 01 11:01:18 crc kubenswrapper[4735]: I1001 11:01:18.348182 4735 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b0cf30336c31980d621f48fed3a862c70dc622902256109c55103cfc575966f"} err="failed to get container status \"8b0cf30336c31980d621f48fed3a862c70dc622902256109c55103cfc575966f\": rpc error: code = NotFound desc = could not find container \"8b0cf30336c31980d621f48fed3a862c70dc622902256109c55103cfc575966f\": container with ID starting with 8b0cf30336c31980d621f48fed3a862c70dc622902256109c55103cfc575966f not found: ID does not exist" Oct 01 11:01:18 crc kubenswrapper[4735]: I1001 11:01:18.610102 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 01 11:01:19 crc kubenswrapper[4735]: I1001 11:01:19.174487 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"66ee97cc-ac56-4879-b605-e2a9347213ca","Type":"ContainerStarted","Data":"52d3827b8ebb42159d843dc0cb7d4ac02ab4e0373f52d8b083c88ca63459449e"} Oct 01 11:01:19 crc kubenswrapper[4735]: I1001 11:01:19.918979 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c" path="/var/lib/kubelet/pods/b9bf3cf7-d817-4140-a6f8-eb7bc7769f8c/volumes" Oct 01 11:01:40 crc kubenswrapper[4735]: I1001 11:01:40.437338 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rt2qw"] Oct 01 11:01:40 crc kubenswrapper[4735]: I1001 11:01:40.439889 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rt2qw" Oct 01 11:01:40 crc kubenswrapper[4735]: I1001 11:01:40.451105 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rt2qw"] Oct 01 11:01:40 crc kubenswrapper[4735]: I1001 11:01:40.498201 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-268l4\" (UniqueName: \"kubernetes.io/projected/8079fe81-70d9-4714-952e-b6b37e6facb7-kube-api-access-268l4\") pod \"certified-operators-rt2qw\" (UID: \"8079fe81-70d9-4714-952e-b6b37e6facb7\") " pod="openshift-marketplace/certified-operators-rt2qw" Oct 01 11:01:40 crc kubenswrapper[4735]: I1001 11:01:40.498276 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8079fe81-70d9-4714-952e-b6b37e6facb7-utilities\") pod \"certified-operators-rt2qw\" (UID: \"8079fe81-70d9-4714-952e-b6b37e6facb7\") " pod="openshift-marketplace/certified-operators-rt2qw" Oct 01 11:01:40 crc kubenswrapper[4735]: I1001 11:01:40.498402 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8079fe81-70d9-4714-952e-b6b37e6facb7-catalog-content\") pod \"certified-operators-rt2qw\" (UID: \"8079fe81-70d9-4714-952e-b6b37e6facb7\") " pod="openshift-marketplace/certified-operators-rt2qw" Oct 01 11:01:40 crc kubenswrapper[4735]: I1001 11:01:40.601049 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-268l4\" (UniqueName: \"kubernetes.io/projected/8079fe81-70d9-4714-952e-b6b37e6facb7-kube-api-access-268l4\") pod \"certified-operators-rt2qw\" (UID: \"8079fe81-70d9-4714-952e-b6b37e6facb7\") " pod="openshift-marketplace/certified-operators-rt2qw" Oct 01 11:01:40 crc kubenswrapper[4735]: I1001 11:01:40.601198 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8079fe81-70d9-4714-952e-b6b37e6facb7-utilities\") pod \"certified-operators-rt2qw\" (UID: \"8079fe81-70d9-4714-952e-b6b37e6facb7\") " pod="openshift-marketplace/certified-operators-rt2qw" Oct 01 11:01:40 crc kubenswrapper[4735]: I1001 11:01:40.601383 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8079fe81-70d9-4714-952e-b6b37e6facb7-catalog-content\") pod \"certified-operators-rt2qw\" (UID: \"8079fe81-70d9-4714-952e-b6b37e6facb7\") " pod="openshift-marketplace/certified-operators-rt2qw" Oct 01 11:01:40 crc kubenswrapper[4735]: I1001 11:01:40.602131 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8079fe81-70d9-4714-952e-b6b37e6facb7-utilities\") pod \"certified-operators-rt2qw\" (UID: \"8079fe81-70d9-4714-952e-b6b37e6facb7\") " pod="openshift-marketplace/certified-operators-rt2qw" Oct 01 11:01:40 crc kubenswrapper[4735]: I1001 11:01:40.602274 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8079fe81-70d9-4714-952e-b6b37e6facb7-catalog-content\") pod \"certified-operators-rt2qw\" (UID: \"8079fe81-70d9-4714-952e-b6b37e6facb7\") " pod="openshift-marketplace/certified-operators-rt2qw" Oct 01 11:01:40 crc kubenswrapper[4735]: I1001 11:01:40.632680 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-268l4\" (UniqueName: \"kubernetes.io/projected/8079fe81-70d9-4714-952e-b6b37e6facb7-kube-api-access-268l4\") pod \"certified-operators-rt2qw\" (UID: \"8079fe81-70d9-4714-952e-b6b37e6facb7\") " pod="openshift-marketplace/certified-operators-rt2qw" Oct 01 11:01:40 crc kubenswrapper[4735]: I1001 11:01:40.761260 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rt2qw" Oct 01 11:01:48 crc kubenswrapper[4735]: I1001 11:01:48.856916 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vkbtx"] Oct 01 11:01:48 crc kubenswrapper[4735]: I1001 11:01:48.859952 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vkbtx" Oct 01 11:01:48 crc kubenswrapper[4735]: I1001 11:01:48.866339 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vkbtx"] Oct 01 11:01:48 crc kubenswrapper[4735]: I1001 11:01:48.989225 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vkd6\" (UniqueName: \"kubernetes.io/projected/79e0b139-dfc9-41fa-9594-7e71650e92c5-kube-api-access-4vkd6\") pod \"redhat-operators-vkbtx\" (UID: \"79e0b139-dfc9-41fa-9594-7e71650e92c5\") " pod="openshift-marketplace/redhat-operators-vkbtx" Oct 01 11:01:48 crc kubenswrapper[4735]: I1001 11:01:48.989305 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79e0b139-dfc9-41fa-9594-7e71650e92c5-catalog-content\") pod \"redhat-operators-vkbtx\" (UID: \"79e0b139-dfc9-41fa-9594-7e71650e92c5\") " pod="openshift-marketplace/redhat-operators-vkbtx" Oct 01 11:01:48 crc kubenswrapper[4735]: I1001 11:01:48.989371 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79e0b139-dfc9-41fa-9594-7e71650e92c5-utilities\") pod \"redhat-operators-vkbtx\" (UID: \"79e0b139-dfc9-41fa-9594-7e71650e92c5\") " pod="openshift-marketplace/redhat-operators-vkbtx" Oct 01 11:01:49 crc kubenswrapper[4735]: I1001 11:01:49.091611 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/79e0b139-dfc9-41fa-9594-7e71650e92c5-catalog-content\") pod \"redhat-operators-vkbtx\" (UID: \"79e0b139-dfc9-41fa-9594-7e71650e92c5\") " pod="openshift-marketplace/redhat-operators-vkbtx" Oct 01 11:01:49 crc kubenswrapper[4735]: I1001 11:01:49.091743 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79e0b139-dfc9-41fa-9594-7e71650e92c5-utilities\") pod \"redhat-operators-vkbtx\" (UID: \"79e0b139-dfc9-41fa-9594-7e71650e92c5\") " pod="openshift-marketplace/redhat-operators-vkbtx" Oct 01 11:01:49 crc kubenswrapper[4735]: I1001 11:01:49.091894 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vkd6\" (UniqueName: \"kubernetes.io/projected/79e0b139-dfc9-41fa-9594-7e71650e92c5-kube-api-access-4vkd6\") pod \"redhat-operators-vkbtx\" (UID: \"79e0b139-dfc9-41fa-9594-7e71650e92c5\") " pod="openshift-marketplace/redhat-operators-vkbtx" Oct 01 11:01:49 crc kubenswrapper[4735]: I1001 11:01:49.092248 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79e0b139-dfc9-41fa-9594-7e71650e92c5-catalog-content\") pod \"redhat-operators-vkbtx\" (UID: \"79e0b139-dfc9-41fa-9594-7e71650e92c5\") " pod="openshift-marketplace/redhat-operators-vkbtx" Oct 01 11:01:49 crc kubenswrapper[4735]: I1001 11:01:49.095308 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79e0b139-dfc9-41fa-9594-7e71650e92c5-utilities\") pod \"redhat-operators-vkbtx\" (UID: \"79e0b139-dfc9-41fa-9594-7e71650e92c5\") " pod="openshift-marketplace/redhat-operators-vkbtx" Oct 01 11:01:49 crc kubenswrapper[4735]: I1001 11:01:49.115431 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vkd6\" (UniqueName: 
\"kubernetes.io/projected/79e0b139-dfc9-41fa-9594-7e71650e92c5-kube-api-access-4vkd6\") pod \"redhat-operators-vkbtx\" (UID: \"79e0b139-dfc9-41fa-9594-7e71650e92c5\") " pod="openshift-marketplace/redhat-operators-vkbtx" Oct 01 11:01:49 crc kubenswrapper[4735]: I1001 11:01:49.192185 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vkbtx" Oct 01 11:01:50 crc kubenswrapper[4735]: E1001 11:01:50.267728 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Oct 01 11:01:50 crc kubenswrapper[4735]: E1001 11:01:50.268675 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openst
ack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5kmxl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(66ee97cc-ac56-4879-b605-e2a9347213ca): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 
01 11:01:50 crc kubenswrapper[4735]: E1001 11:01:50.270282 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="66ee97cc-ac56-4879-b605-e2a9347213ca" Oct 01 11:01:50 crc kubenswrapper[4735]: E1001 11:01:50.501677 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="66ee97cc-ac56-4879-b605-e2a9347213ca" Oct 01 11:01:50 crc kubenswrapper[4735]: I1001 11:01:50.625060 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vkbtx"] Oct 01 11:01:50 crc kubenswrapper[4735]: I1001 11:01:50.699088 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rt2qw"] Oct 01 11:01:50 crc kubenswrapper[4735]: W1001 11:01:50.703740 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8079fe81_70d9_4714_952e_b6b37e6facb7.slice/crio-e695fe7cf594cbd0f94afbfd2f889dd2bc8e476e02bb3e540af0e515f58e2b2f WatchSource:0}: Error finding container e695fe7cf594cbd0f94afbfd2f889dd2bc8e476e02bb3e540af0e515f58e2b2f: Status 404 returned error can't find the container with id e695fe7cf594cbd0f94afbfd2f889dd2bc8e476e02bb3e540af0e515f58e2b2f Oct 01 11:01:51 crc kubenswrapper[4735]: I1001 11:01:51.510599 4735 generic.go:334] "Generic (PLEG): container finished" podID="79e0b139-dfc9-41fa-9594-7e71650e92c5" containerID="3e580f4eea5bb462108354a7a2650ea71c37b0f841b784ef54e88d9c4efb2f75" exitCode=0 Oct 01 11:01:51 crc kubenswrapper[4735]: I1001 11:01:51.510872 4735 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vkbtx" event={"ID":"79e0b139-dfc9-41fa-9594-7e71650e92c5","Type":"ContainerDied","Data":"3e580f4eea5bb462108354a7a2650ea71c37b0f841b784ef54e88d9c4efb2f75"} Oct 01 11:01:51 crc kubenswrapper[4735]: I1001 11:01:51.510898 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vkbtx" event={"ID":"79e0b139-dfc9-41fa-9594-7e71650e92c5","Type":"ContainerStarted","Data":"ae440cf9be28fafe162d1b66ab2e3eee4acd5a6cd6931d92e0f63ef45baa925d"} Oct 01 11:01:51 crc kubenswrapper[4735]: I1001 11:01:51.516649 4735 generic.go:334] "Generic (PLEG): container finished" podID="8079fe81-70d9-4714-952e-b6b37e6facb7" containerID="e5adf599f23fae928670e772171e04d983235415cb8953719310dca65ffcecf4" exitCode=0 Oct 01 11:01:51 crc kubenswrapper[4735]: I1001 11:01:51.516980 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rt2qw" event={"ID":"8079fe81-70d9-4714-952e-b6b37e6facb7","Type":"ContainerDied","Data":"e5adf599f23fae928670e772171e04d983235415cb8953719310dca65ffcecf4"} Oct 01 11:01:51 crc kubenswrapper[4735]: I1001 11:01:51.517017 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rt2qw" event={"ID":"8079fe81-70d9-4714-952e-b6b37e6facb7","Type":"ContainerStarted","Data":"e695fe7cf594cbd0f94afbfd2f889dd2bc8e476e02bb3e540af0e515f58e2b2f"} Oct 01 11:01:53 crc kubenswrapper[4735]: I1001 11:01:53.541647 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vkbtx" event={"ID":"79e0b139-dfc9-41fa-9594-7e71650e92c5","Type":"ContainerStarted","Data":"470b4a552ae20a8ce06aa64cde1bbf4c7e5163a40113af456d997897db233ae3"} Oct 01 11:01:53 crc kubenswrapper[4735]: I1001 11:01:53.545273 4735 generic.go:334] "Generic (PLEG): container finished" podID="8079fe81-70d9-4714-952e-b6b37e6facb7" 
containerID="d05507e618d21f4330c5d1dcefa2ab60ab86d898a0d94ac8dfbe4b342d30c47a" exitCode=0 Oct 01 11:01:53 crc kubenswrapper[4735]: I1001 11:01:53.545317 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rt2qw" event={"ID":"8079fe81-70d9-4714-952e-b6b37e6facb7","Type":"ContainerDied","Data":"d05507e618d21f4330c5d1dcefa2ab60ab86d898a0d94ac8dfbe4b342d30c47a"} Oct 01 11:01:54 crc kubenswrapper[4735]: I1001 11:01:54.560546 4735 generic.go:334] "Generic (PLEG): container finished" podID="79e0b139-dfc9-41fa-9594-7e71650e92c5" containerID="470b4a552ae20a8ce06aa64cde1bbf4c7e5163a40113af456d997897db233ae3" exitCode=0 Oct 01 11:01:54 crc kubenswrapper[4735]: I1001 11:01:54.560641 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vkbtx" event={"ID":"79e0b139-dfc9-41fa-9594-7e71650e92c5","Type":"ContainerDied","Data":"470b4a552ae20a8ce06aa64cde1bbf4c7e5163a40113af456d997897db233ae3"} Oct 01 11:01:54 crc kubenswrapper[4735]: I1001 11:01:54.566480 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rt2qw" event={"ID":"8079fe81-70d9-4714-952e-b6b37e6facb7","Type":"ContainerStarted","Data":"9e3829b645ef3b6f09db7fdaafaaee001433da3de1c8bef4acf145b4db402bd2"} Oct 01 11:01:54 crc kubenswrapper[4735]: I1001 11:01:54.610595 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rt2qw" podStartSLOduration=12.017371643 podStartE2EDuration="14.610570297s" podCreationTimestamp="2025-10-01 11:01:40 +0000 UTC" firstStartedPulling="2025-10-01 11:01:51.519281967 +0000 UTC m=+2670.212103229" lastFinishedPulling="2025-10-01 11:01:54.112480581 +0000 UTC m=+2672.805301883" observedRunningTime="2025-10-01 11:01:54.60200294 +0000 UTC m=+2673.294824212" watchObservedRunningTime="2025-10-01 11:01:54.610570297 +0000 UTC m=+2673.303391589" Oct 01 11:01:55 crc kubenswrapper[4735]: I1001 
11:01:55.577640 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vkbtx" event={"ID":"79e0b139-dfc9-41fa-9594-7e71650e92c5","Type":"ContainerStarted","Data":"6b7aeacfacb75222fe1cc71331c6ce5dfe4f09359c720f2ac9bea0a98cf7ba1f"} Oct 01 11:01:55 crc kubenswrapper[4735]: I1001 11:01:55.599174 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vkbtx" podStartSLOduration=4.029617837 podStartE2EDuration="7.599155848s" podCreationTimestamp="2025-10-01 11:01:48 +0000 UTC" firstStartedPulling="2025-10-01 11:01:51.513320229 +0000 UTC m=+2670.206141531" lastFinishedPulling="2025-10-01 11:01:55.08285824 +0000 UTC m=+2673.775679542" observedRunningTime="2025-10-01 11:01:55.598292525 +0000 UTC m=+2674.291113787" watchObservedRunningTime="2025-10-01 11:01:55.599155848 +0000 UTC m=+2674.291977130" Oct 01 11:01:59 crc kubenswrapper[4735]: I1001 11:01:59.193163 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vkbtx" Oct 01 11:01:59 crc kubenswrapper[4735]: I1001 11:01:59.193909 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vkbtx" Oct 01 11:02:00 crc kubenswrapper[4735]: I1001 11:02:00.253086 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vkbtx" podUID="79e0b139-dfc9-41fa-9594-7e71650e92c5" containerName="registry-server" probeResult="failure" output=< Oct 01 11:02:00 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Oct 01 11:02:00 crc kubenswrapper[4735]: > Oct 01 11:02:00 crc kubenswrapper[4735]: I1001 11:02:00.762813 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rt2qw" Oct 01 11:02:00 crc kubenswrapper[4735]: I1001 11:02:00.762884 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-rt2qw" Oct 01 11:02:00 crc kubenswrapper[4735]: I1001 11:02:00.831536 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rt2qw" Oct 01 11:02:01 crc kubenswrapper[4735]: I1001 11:02:01.702934 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rt2qw" Oct 01 11:02:01 crc kubenswrapper[4735]: I1001 11:02:01.779102 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rt2qw"] Oct 01 11:02:03 crc kubenswrapper[4735]: I1001 11:02:03.673753 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rt2qw" podUID="8079fe81-70d9-4714-952e-b6b37e6facb7" containerName="registry-server" containerID="cri-o://9e3829b645ef3b6f09db7fdaafaaee001433da3de1c8bef4acf145b4db402bd2" gracePeriod=2 Oct 01 11:02:05 crc kubenswrapper[4735]: I1001 11:02:05.521323 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 01 11:02:05 crc kubenswrapper[4735]: I1001 11:02:05.701356 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rt2qw" Oct 01 11:02:05 crc kubenswrapper[4735]: I1001 11:02:05.720311 4735 generic.go:334] "Generic (PLEG): container finished" podID="8079fe81-70d9-4714-952e-b6b37e6facb7" containerID="9e3829b645ef3b6f09db7fdaafaaee001433da3de1c8bef4acf145b4db402bd2" exitCode=0 Oct 01 11:02:05 crc kubenswrapper[4735]: I1001 11:02:05.720428 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rt2qw" event={"ID":"8079fe81-70d9-4714-952e-b6b37e6facb7","Type":"ContainerDied","Data":"9e3829b645ef3b6f09db7fdaafaaee001433da3de1c8bef4acf145b4db402bd2"} Oct 01 11:02:05 crc kubenswrapper[4735]: I1001 11:02:05.720513 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rt2qw" event={"ID":"8079fe81-70d9-4714-952e-b6b37e6facb7","Type":"ContainerDied","Data":"e695fe7cf594cbd0f94afbfd2f889dd2bc8e476e02bb3e540af0e515f58e2b2f"} Oct 01 11:02:05 crc kubenswrapper[4735]: I1001 11:02:05.720542 4735 scope.go:117] "RemoveContainer" containerID="9e3829b645ef3b6f09db7fdaafaaee001433da3de1c8bef4acf145b4db402bd2" Oct 01 11:02:05 crc kubenswrapper[4735]: I1001 11:02:05.720986 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rt2qw" Oct 01 11:02:05 crc kubenswrapper[4735]: I1001 11:02:05.741061 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8079fe81-70d9-4714-952e-b6b37e6facb7-catalog-content\") pod \"8079fe81-70d9-4714-952e-b6b37e6facb7\" (UID: \"8079fe81-70d9-4714-952e-b6b37e6facb7\") " Oct 01 11:02:05 crc kubenswrapper[4735]: I1001 11:02:05.741148 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8079fe81-70d9-4714-952e-b6b37e6facb7-utilities\") pod \"8079fe81-70d9-4714-952e-b6b37e6facb7\" (UID: \"8079fe81-70d9-4714-952e-b6b37e6facb7\") " Oct 01 11:02:05 crc kubenswrapper[4735]: I1001 11:02:05.741232 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-268l4\" (UniqueName: \"kubernetes.io/projected/8079fe81-70d9-4714-952e-b6b37e6facb7-kube-api-access-268l4\") pod \"8079fe81-70d9-4714-952e-b6b37e6facb7\" (UID: \"8079fe81-70d9-4714-952e-b6b37e6facb7\") " Oct 01 11:02:05 crc kubenswrapper[4735]: I1001 11:02:05.742326 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8079fe81-70d9-4714-952e-b6b37e6facb7-utilities" (OuterVolumeSpecName: "utilities") pod "8079fe81-70d9-4714-952e-b6b37e6facb7" (UID: "8079fe81-70d9-4714-952e-b6b37e6facb7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:02:05 crc kubenswrapper[4735]: I1001 11:02:05.748319 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8079fe81-70d9-4714-952e-b6b37e6facb7-kube-api-access-268l4" (OuterVolumeSpecName: "kube-api-access-268l4") pod "8079fe81-70d9-4714-952e-b6b37e6facb7" (UID: "8079fe81-70d9-4714-952e-b6b37e6facb7"). InnerVolumeSpecName "kube-api-access-268l4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:02:05 crc kubenswrapper[4735]: I1001 11:02:05.773909 4735 scope.go:117] "RemoveContainer" containerID="d05507e618d21f4330c5d1dcefa2ab60ab86d898a0d94ac8dfbe4b342d30c47a" Oct 01 11:02:05 crc kubenswrapper[4735]: I1001 11:02:05.795790 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8079fe81-70d9-4714-952e-b6b37e6facb7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8079fe81-70d9-4714-952e-b6b37e6facb7" (UID: "8079fe81-70d9-4714-952e-b6b37e6facb7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:02:05 crc kubenswrapper[4735]: I1001 11:02:05.816744 4735 scope.go:117] "RemoveContainer" containerID="e5adf599f23fae928670e772171e04d983235415cb8953719310dca65ffcecf4" Oct 01 11:02:05 crc kubenswrapper[4735]: I1001 11:02:05.843250 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-268l4\" (UniqueName: \"kubernetes.io/projected/8079fe81-70d9-4714-952e-b6b37e6facb7-kube-api-access-268l4\") on node \"crc\" DevicePath \"\"" Oct 01 11:02:05 crc kubenswrapper[4735]: I1001 11:02:05.843289 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8079fe81-70d9-4714-952e-b6b37e6facb7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 11:02:05 crc kubenswrapper[4735]: I1001 11:02:05.843299 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8079fe81-70d9-4714-952e-b6b37e6facb7-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 11:02:05 crc kubenswrapper[4735]: I1001 11:02:05.851243 4735 scope.go:117] "RemoveContainer" containerID="9e3829b645ef3b6f09db7fdaafaaee001433da3de1c8bef4acf145b4db402bd2" Oct 01 11:02:05 crc kubenswrapper[4735]: E1001 11:02:05.851843 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"9e3829b645ef3b6f09db7fdaafaaee001433da3de1c8bef4acf145b4db402bd2\": container with ID starting with 9e3829b645ef3b6f09db7fdaafaaee001433da3de1c8bef4acf145b4db402bd2 not found: ID does not exist" containerID="9e3829b645ef3b6f09db7fdaafaaee001433da3de1c8bef4acf145b4db402bd2" Oct 01 11:02:05 crc kubenswrapper[4735]: I1001 11:02:05.851885 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e3829b645ef3b6f09db7fdaafaaee001433da3de1c8bef4acf145b4db402bd2"} err="failed to get container status \"9e3829b645ef3b6f09db7fdaafaaee001433da3de1c8bef4acf145b4db402bd2\": rpc error: code = NotFound desc = could not find container \"9e3829b645ef3b6f09db7fdaafaaee001433da3de1c8bef4acf145b4db402bd2\": container with ID starting with 9e3829b645ef3b6f09db7fdaafaaee001433da3de1c8bef4acf145b4db402bd2 not found: ID does not exist" Oct 01 11:02:05 crc kubenswrapper[4735]: I1001 11:02:05.851909 4735 scope.go:117] "RemoveContainer" containerID="d05507e618d21f4330c5d1dcefa2ab60ab86d898a0d94ac8dfbe4b342d30c47a" Oct 01 11:02:05 crc kubenswrapper[4735]: E1001 11:02:05.852308 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d05507e618d21f4330c5d1dcefa2ab60ab86d898a0d94ac8dfbe4b342d30c47a\": container with ID starting with d05507e618d21f4330c5d1dcefa2ab60ab86d898a0d94ac8dfbe4b342d30c47a not found: ID does not exist" containerID="d05507e618d21f4330c5d1dcefa2ab60ab86d898a0d94ac8dfbe4b342d30c47a" Oct 01 11:02:05 crc kubenswrapper[4735]: I1001 11:02:05.852330 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d05507e618d21f4330c5d1dcefa2ab60ab86d898a0d94ac8dfbe4b342d30c47a"} err="failed to get container status \"d05507e618d21f4330c5d1dcefa2ab60ab86d898a0d94ac8dfbe4b342d30c47a\": rpc error: code = NotFound desc = could not find container 
\"d05507e618d21f4330c5d1dcefa2ab60ab86d898a0d94ac8dfbe4b342d30c47a\": container with ID starting with d05507e618d21f4330c5d1dcefa2ab60ab86d898a0d94ac8dfbe4b342d30c47a not found: ID does not exist" Oct 01 11:02:05 crc kubenswrapper[4735]: I1001 11:02:05.852344 4735 scope.go:117] "RemoveContainer" containerID="e5adf599f23fae928670e772171e04d983235415cb8953719310dca65ffcecf4" Oct 01 11:02:05 crc kubenswrapper[4735]: E1001 11:02:05.853108 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5adf599f23fae928670e772171e04d983235415cb8953719310dca65ffcecf4\": container with ID starting with e5adf599f23fae928670e772171e04d983235415cb8953719310dca65ffcecf4 not found: ID does not exist" containerID="e5adf599f23fae928670e772171e04d983235415cb8953719310dca65ffcecf4" Oct 01 11:02:05 crc kubenswrapper[4735]: I1001 11:02:05.853136 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5adf599f23fae928670e772171e04d983235415cb8953719310dca65ffcecf4"} err="failed to get container status \"e5adf599f23fae928670e772171e04d983235415cb8953719310dca65ffcecf4\": rpc error: code = NotFound desc = could not find container \"e5adf599f23fae928670e772171e04d983235415cb8953719310dca65ffcecf4\": container with ID starting with e5adf599f23fae928670e772171e04d983235415cb8953719310dca65ffcecf4 not found: ID does not exist" Oct 01 11:02:06 crc kubenswrapper[4735]: I1001 11:02:06.056284 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rt2qw"] Oct 01 11:02:06 crc kubenswrapper[4735]: I1001 11:02:06.066394 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rt2qw"] Oct 01 11:02:06 crc kubenswrapper[4735]: I1001 11:02:06.742759 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" 
event={"ID":"66ee97cc-ac56-4879-b605-e2a9347213ca","Type":"ContainerStarted","Data":"c66c2a90fcf19c0c94e9ab286bd40fbe3f728bc2c931291136def5027cdcae1b"} Oct 01 11:02:06 crc kubenswrapper[4735]: I1001 11:02:06.781725 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.8967087620000003 podStartE2EDuration="50.781702122s" podCreationTimestamp="2025-10-01 11:01:16 +0000 UTC" firstStartedPulling="2025-10-01 11:01:18.631704103 +0000 UTC m=+2637.324525365" lastFinishedPulling="2025-10-01 11:02:05.516697453 +0000 UTC m=+2684.209518725" observedRunningTime="2025-10-01 11:02:06.774302656 +0000 UTC m=+2685.467123918" watchObservedRunningTime="2025-10-01 11:02:06.781702122 +0000 UTC m=+2685.474523384" Oct 01 11:02:07 crc kubenswrapper[4735]: I1001 11:02:07.919982 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8079fe81-70d9-4714-952e-b6b37e6facb7" path="/var/lib/kubelet/pods/8079fe81-70d9-4714-952e-b6b37e6facb7/volumes" Oct 01 11:02:09 crc kubenswrapper[4735]: I1001 11:02:09.246184 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vkbtx" Oct 01 11:02:09 crc kubenswrapper[4735]: I1001 11:02:09.310241 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vkbtx" Oct 01 11:02:09 crc kubenswrapper[4735]: I1001 11:02:09.485714 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vkbtx"] Oct 01 11:02:10 crc kubenswrapper[4735]: I1001 11:02:10.792967 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vkbtx" podUID="79e0b139-dfc9-41fa-9594-7e71650e92c5" containerName="registry-server" containerID="cri-o://6b7aeacfacb75222fe1cc71331c6ce5dfe4f09359c720f2ac9bea0a98cf7ba1f" gracePeriod=2 Oct 01 11:02:11 crc kubenswrapper[4735]: I1001 
11:02:11.230422 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vkbtx" Oct 01 11:02:11 crc kubenswrapper[4735]: I1001 11:02:11.393386 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79e0b139-dfc9-41fa-9594-7e71650e92c5-utilities\") pod \"79e0b139-dfc9-41fa-9594-7e71650e92c5\" (UID: \"79e0b139-dfc9-41fa-9594-7e71650e92c5\") " Oct 01 11:02:11 crc kubenswrapper[4735]: I1001 11:02:11.394096 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79e0b139-dfc9-41fa-9594-7e71650e92c5-catalog-content\") pod \"79e0b139-dfc9-41fa-9594-7e71650e92c5\" (UID: \"79e0b139-dfc9-41fa-9594-7e71650e92c5\") " Oct 01 11:02:11 crc kubenswrapper[4735]: I1001 11:02:11.394326 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vkd6\" (UniqueName: \"kubernetes.io/projected/79e0b139-dfc9-41fa-9594-7e71650e92c5-kube-api-access-4vkd6\") pod \"79e0b139-dfc9-41fa-9594-7e71650e92c5\" (UID: \"79e0b139-dfc9-41fa-9594-7e71650e92c5\") " Oct 01 11:02:11 crc kubenswrapper[4735]: I1001 11:02:11.394437 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79e0b139-dfc9-41fa-9594-7e71650e92c5-utilities" (OuterVolumeSpecName: "utilities") pod "79e0b139-dfc9-41fa-9594-7e71650e92c5" (UID: "79e0b139-dfc9-41fa-9594-7e71650e92c5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:02:11 crc kubenswrapper[4735]: I1001 11:02:11.394969 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79e0b139-dfc9-41fa-9594-7e71650e92c5-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 11:02:11 crc kubenswrapper[4735]: I1001 11:02:11.409744 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79e0b139-dfc9-41fa-9594-7e71650e92c5-kube-api-access-4vkd6" (OuterVolumeSpecName: "kube-api-access-4vkd6") pod "79e0b139-dfc9-41fa-9594-7e71650e92c5" (UID: "79e0b139-dfc9-41fa-9594-7e71650e92c5"). InnerVolumeSpecName "kube-api-access-4vkd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:02:11 crc kubenswrapper[4735]: I1001 11:02:11.470824 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79e0b139-dfc9-41fa-9594-7e71650e92c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79e0b139-dfc9-41fa-9594-7e71650e92c5" (UID: "79e0b139-dfc9-41fa-9594-7e71650e92c5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:02:11 crc kubenswrapper[4735]: I1001 11:02:11.496335 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vkd6\" (UniqueName: \"kubernetes.io/projected/79e0b139-dfc9-41fa-9594-7e71650e92c5-kube-api-access-4vkd6\") on node \"crc\" DevicePath \"\"" Oct 01 11:02:11 crc kubenswrapper[4735]: I1001 11:02:11.496365 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79e0b139-dfc9-41fa-9594-7e71650e92c5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 11:02:11 crc kubenswrapper[4735]: I1001 11:02:11.802372 4735 generic.go:334] "Generic (PLEG): container finished" podID="79e0b139-dfc9-41fa-9594-7e71650e92c5" containerID="6b7aeacfacb75222fe1cc71331c6ce5dfe4f09359c720f2ac9bea0a98cf7ba1f" exitCode=0 Oct 01 11:02:11 crc kubenswrapper[4735]: I1001 11:02:11.802440 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vkbtx" event={"ID":"79e0b139-dfc9-41fa-9594-7e71650e92c5","Type":"ContainerDied","Data":"6b7aeacfacb75222fe1cc71331c6ce5dfe4f09359c720f2ac9bea0a98cf7ba1f"} Oct 01 11:02:11 crc kubenswrapper[4735]: I1001 11:02:11.802459 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vkbtx" Oct 01 11:02:11 crc kubenswrapper[4735]: I1001 11:02:11.802486 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vkbtx" event={"ID":"79e0b139-dfc9-41fa-9594-7e71650e92c5","Type":"ContainerDied","Data":"ae440cf9be28fafe162d1b66ab2e3eee4acd5a6cd6931d92e0f63ef45baa925d"} Oct 01 11:02:11 crc kubenswrapper[4735]: I1001 11:02:11.802529 4735 scope.go:117] "RemoveContainer" containerID="6b7aeacfacb75222fe1cc71331c6ce5dfe4f09359c720f2ac9bea0a98cf7ba1f" Oct 01 11:02:11 crc kubenswrapper[4735]: I1001 11:02:11.825955 4735 scope.go:117] "RemoveContainer" containerID="470b4a552ae20a8ce06aa64cde1bbf4c7e5163a40113af456d997897db233ae3" Oct 01 11:02:11 crc kubenswrapper[4735]: I1001 11:02:11.853004 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vkbtx"] Oct 01 11:02:11 crc kubenswrapper[4735]: I1001 11:02:11.856954 4735 scope.go:117] "RemoveContainer" containerID="3e580f4eea5bb462108354a7a2650ea71c37b0f841b784ef54e88d9c4efb2f75" Oct 01 11:02:11 crc kubenswrapper[4735]: I1001 11:02:11.860479 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vkbtx"] Oct 01 11:02:11 crc kubenswrapper[4735]: I1001 11:02:11.916774 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79e0b139-dfc9-41fa-9594-7e71650e92c5" path="/var/lib/kubelet/pods/79e0b139-dfc9-41fa-9594-7e71650e92c5/volumes" Oct 01 11:02:11 crc kubenswrapper[4735]: I1001 11:02:11.937620 4735 scope.go:117] "RemoveContainer" containerID="6b7aeacfacb75222fe1cc71331c6ce5dfe4f09359c720f2ac9bea0a98cf7ba1f" Oct 01 11:02:11 crc kubenswrapper[4735]: E1001 11:02:11.938088 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b7aeacfacb75222fe1cc71331c6ce5dfe4f09359c720f2ac9bea0a98cf7ba1f\": container with ID starting with 
6b7aeacfacb75222fe1cc71331c6ce5dfe4f09359c720f2ac9bea0a98cf7ba1f not found: ID does not exist" containerID="6b7aeacfacb75222fe1cc71331c6ce5dfe4f09359c720f2ac9bea0a98cf7ba1f" Oct 01 11:02:11 crc kubenswrapper[4735]: I1001 11:02:11.938133 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b7aeacfacb75222fe1cc71331c6ce5dfe4f09359c720f2ac9bea0a98cf7ba1f"} err="failed to get container status \"6b7aeacfacb75222fe1cc71331c6ce5dfe4f09359c720f2ac9bea0a98cf7ba1f\": rpc error: code = NotFound desc = could not find container \"6b7aeacfacb75222fe1cc71331c6ce5dfe4f09359c720f2ac9bea0a98cf7ba1f\": container with ID starting with 6b7aeacfacb75222fe1cc71331c6ce5dfe4f09359c720f2ac9bea0a98cf7ba1f not found: ID does not exist" Oct 01 11:02:11 crc kubenswrapper[4735]: I1001 11:02:11.938159 4735 scope.go:117] "RemoveContainer" containerID="470b4a552ae20a8ce06aa64cde1bbf4c7e5163a40113af456d997897db233ae3" Oct 01 11:02:11 crc kubenswrapper[4735]: E1001 11:02:11.939092 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"470b4a552ae20a8ce06aa64cde1bbf4c7e5163a40113af456d997897db233ae3\": container with ID starting with 470b4a552ae20a8ce06aa64cde1bbf4c7e5163a40113af456d997897db233ae3 not found: ID does not exist" containerID="470b4a552ae20a8ce06aa64cde1bbf4c7e5163a40113af456d997897db233ae3" Oct 01 11:02:11 crc kubenswrapper[4735]: I1001 11:02:11.939241 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"470b4a552ae20a8ce06aa64cde1bbf4c7e5163a40113af456d997897db233ae3"} err="failed to get container status \"470b4a552ae20a8ce06aa64cde1bbf4c7e5163a40113af456d997897db233ae3\": rpc error: code = NotFound desc = could not find container \"470b4a552ae20a8ce06aa64cde1bbf4c7e5163a40113af456d997897db233ae3\": container with ID starting with 470b4a552ae20a8ce06aa64cde1bbf4c7e5163a40113af456d997897db233ae3 not found: ID does not 
exist" Oct 01 11:02:11 crc kubenswrapper[4735]: I1001 11:02:11.939331 4735 scope.go:117] "RemoveContainer" containerID="3e580f4eea5bb462108354a7a2650ea71c37b0f841b784ef54e88d9c4efb2f75" Oct 01 11:02:11 crc kubenswrapper[4735]: E1001 11:02:11.943331 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e580f4eea5bb462108354a7a2650ea71c37b0f841b784ef54e88d9c4efb2f75\": container with ID starting with 3e580f4eea5bb462108354a7a2650ea71c37b0f841b784ef54e88d9c4efb2f75 not found: ID does not exist" containerID="3e580f4eea5bb462108354a7a2650ea71c37b0f841b784ef54e88d9c4efb2f75" Oct 01 11:02:11 crc kubenswrapper[4735]: I1001 11:02:11.943358 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e580f4eea5bb462108354a7a2650ea71c37b0f841b784ef54e88d9c4efb2f75"} err="failed to get container status \"3e580f4eea5bb462108354a7a2650ea71c37b0f841b784ef54e88d9c4efb2f75\": rpc error: code = NotFound desc = could not find container \"3e580f4eea5bb462108354a7a2650ea71c37b0f841b784ef54e88d9c4efb2f75\": container with ID starting with 3e580f4eea5bb462108354a7a2650ea71c37b0f841b784ef54e88d9c4efb2f75 not found: ID does not exist" Oct 01 11:02:35 crc kubenswrapper[4735]: I1001 11:02:35.485936 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 11:02:35 crc kubenswrapper[4735]: I1001 11:02:35.486443 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 
11:03:05 crc kubenswrapper[4735]: I1001 11:03:05.485744 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 11:03:05 crc kubenswrapper[4735]: I1001 11:03:05.486356 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 11:03:35 crc kubenswrapper[4735]: I1001 11:03:35.485783 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 11:03:35 crc kubenswrapper[4735]: I1001 11:03:35.486665 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 11:03:35 crc kubenswrapper[4735]: I1001 11:03:35.486741 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" Oct 01 11:03:35 crc kubenswrapper[4735]: I1001 11:03:35.487886 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ac9f83cabc5202081a1059b4f03ea2e0039b7add88f2cc679eb472a1c50e7db1"} 
pod="openshift-machine-config-operator/machine-config-daemon-xgg24" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 11:03:35 crc kubenswrapper[4735]: I1001 11:03:35.487973 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" containerID="cri-o://ac9f83cabc5202081a1059b4f03ea2e0039b7add88f2cc679eb472a1c50e7db1" gracePeriod=600 Oct 01 11:03:35 crc kubenswrapper[4735]: I1001 11:03:35.770555 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" event={"ID":"8c2fdbf0-2469-4ca0-8624-d63609123cd1","Type":"ContainerDied","Data":"ac9f83cabc5202081a1059b4f03ea2e0039b7add88f2cc679eb472a1c50e7db1"} Oct 01 11:03:35 crc kubenswrapper[4735]: I1001 11:03:35.770934 4735 scope.go:117] "RemoveContainer" containerID="1032fe4bfe7b37437f955afc568b4ff5ec85395d8a3b1654729d50088e29791f" Oct 01 11:03:35 crc kubenswrapper[4735]: I1001 11:03:35.770580 4735 generic.go:334] "Generic (PLEG): container finished" podID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerID="ac9f83cabc5202081a1059b4f03ea2e0039b7add88f2cc679eb472a1c50e7db1" exitCode=0 Oct 01 11:03:36 crc kubenswrapper[4735]: I1001 11:03:36.783386 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" event={"ID":"8c2fdbf0-2469-4ca0-8624-d63609123cd1","Type":"ContainerStarted","Data":"a4914eda549785b6e3091537df03e9f39ab7844172c26d202c4f0e5020c25e24"} Oct 01 11:05:35 crc kubenswrapper[4735]: I1001 11:05:35.485797 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Oct 01 11:05:35 crc kubenswrapper[4735]: I1001 11:05:35.486335 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 11:06:05 crc kubenswrapper[4735]: I1001 11:06:05.485266 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 11:06:05 crc kubenswrapper[4735]: I1001 11:06:05.486096 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 11:06:35 crc kubenswrapper[4735]: I1001 11:06:35.486187 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 11:06:35 crc kubenswrapper[4735]: I1001 11:06:35.486958 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 11:06:35 crc kubenswrapper[4735]: I1001 11:06:35.487023 4735 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" Oct 01 11:06:35 crc kubenswrapper[4735]: I1001 11:06:35.488060 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a4914eda549785b6e3091537df03e9f39ab7844172c26d202c4f0e5020c25e24"} pod="openshift-machine-config-operator/machine-config-daemon-xgg24" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 11:06:35 crc kubenswrapper[4735]: I1001 11:06:35.488150 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" containerID="cri-o://a4914eda549785b6e3091537df03e9f39ab7844172c26d202c4f0e5020c25e24" gracePeriod=600 Oct 01 11:06:35 crc kubenswrapper[4735]: E1001 11:06:35.624059 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:06:35 crc kubenswrapper[4735]: I1001 11:06:35.726212 4735 generic.go:334] "Generic (PLEG): container finished" podID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerID="a4914eda549785b6e3091537df03e9f39ab7844172c26d202c4f0e5020c25e24" exitCode=0 Oct 01 11:06:35 crc kubenswrapper[4735]: I1001 11:06:35.726278 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" 
event={"ID":"8c2fdbf0-2469-4ca0-8624-d63609123cd1","Type":"ContainerDied","Data":"a4914eda549785b6e3091537df03e9f39ab7844172c26d202c4f0e5020c25e24"} Oct 01 11:06:35 crc kubenswrapper[4735]: I1001 11:06:35.726402 4735 scope.go:117] "RemoveContainer" containerID="ac9f83cabc5202081a1059b4f03ea2e0039b7add88f2cc679eb472a1c50e7db1" Oct 01 11:06:35 crc kubenswrapper[4735]: I1001 11:06:35.727010 4735 scope.go:117] "RemoveContainer" containerID="a4914eda549785b6e3091537df03e9f39ab7844172c26d202c4f0e5020c25e24" Oct 01 11:06:35 crc kubenswrapper[4735]: E1001 11:06:35.727449 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:06:47 crc kubenswrapper[4735]: I1001 11:06:47.897591 4735 scope.go:117] "RemoveContainer" containerID="a4914eda549785b6e3091537df03e9f39ab7844172c26d202c4f0e5020c25e24" Oct 01 11:06:47 crc kubenswrapper[4735]: E1001 11:06:47.898678 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:07:01 crc kubenswrapper[4735]: I1001 11:07:01.917359 4735 scope.go:117] "RemoveContainer" containerID="a4914eda549785b6e3091537df03e9f39ab7844172c26d202c4f0e5020c25e24" Oct 01 11:07:01 crc kubenswrapper[4735]: E1001 11:07:01.919121 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:07:06 crc kubenswrapper[4735]: I1001 11:07:06.734837 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wg6l9"] Oct 01 11:07:06 crc kubenswrapper[4735]: E1001 11:07:06.735578 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79e0b139-dfc9-41fa-9594-7e71650e92c5" containerName="extract-content" Oct 01 11:07:06 crc kubenswrapper[4735]: I1001 11:07:06.735592 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="79e0b139-dfc9-41fa-9594-7e71650e92c5" containerName="extract-content" Oct 01 11:07:06 crc kubenswrapper[4735]: E1001 11:07:06.735617 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79e0b139-dfc9-41fa-9594-7e71650e92c5" containerName="extract-utilities" Oct 01 11:07:06 crc kubenswrapper[4735]: I1001 11:07:06.735623 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="79e0b139-dfc9-41fa-9594-7e71650e92c5" containerName="extract-utilities" Oct 01 11:07:06 crc kubenswrapper[4735]: E1001 11:07:06.735631 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8079fe81-70d9-4714-952e-b6b37e6facb7" containerName="extract-content" Oct 01 11:07:06 crc kubenswrapper[4735]: I1001 11:07:06.735638 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8079fe81-70d9-4714-952e-b6b37e6facb7" containerName="extract-content" Oct 01 11:07:06 crc kubenswrapper[4735]: E1001 11:07:06.735648 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8079fe81-70d9-4714-952e-b6b37e6facb7" containerName="registry-server" Oct 01 11:07:06 crc kubenswrapper[4735]: I1001 11:07:06.735653 4735 
state_mem.go:107] "Deleted CPUSet assignment" podUID="8079fe81-70d9-4714-952e-b6b37e6facb7" containerName="registry-server" Oct 01 11:07:06 crc kubenswrapper[4735]: E1001 11:07:06.735665 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79e0b139-dfc9-41fa-9594-7e71650e92c5" containerName="registry-server" Oct 01 11:07:06 crc kubenswrapper[4735]: I1001 11:07:06.735670 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="79e0b139-dfc9-41fa-9594-7e71650e92c5" containerName="registry-server" Oct 01 11:07:06 crc kubenswrapper[4735]: E1001 11:07:06.735684 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8079fe81-70d9-4714-952e-b6b37e6facb7" containerName="extract-utilities" Oct 01 11:07:06 crc kubenswrapper[4735]: I1001 11:07:06.735690 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8079fe81-70d9-4714-952e-b6b37e6facb7" containerName="extract-utilities" Oct 01 11:07:06 crc kubenswrapper[4735]: I1001 11:07:06.735871 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="79e0b139-dfc9-41fa-9594-7e71650e92c5" containerName="registry-server" Oct 01 11:07:06 crc kubenswrapper[4735]: I1001 11:07:06.735893 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="8079fe81-70d9-4714-952e-b6b37e6facb7" containerName="registry-server" Oct 01 11:07:06 crc kubenswrapper[4735]: I1001 11:07:06.737359 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wg6l9" Oct 01 11:07:06 crc kubenswrapper[4735]: I1001 11:07:06.743922 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wg6l9"] Oct 01 11:07:06 crc kubenswrapper[4735]: I1001 11:07:06.822050 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cbf4311-6ba1-4806-8d10-aa4e0d34039f-utilities\") pod \"redhat-marketplace-wg6l9\" (UID: \"7cbf4311-6ba1-4806-8d10-aa4e0d34039f\") " pod="openshift-marketplace/redhat-marketplace-wg6l9" Oct 01 11:07:06 crc kubenswrapper[4735]: I1001 11:07:06.822107 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cbf4311-6ba1-4806-8d10-aa4e0d34039f-catalog-content\") pod \"redhat-marketplace-wg6l9\" (UID: \"7cbf4311-6ba1-4806-8d10-aa4e0d34039f\") " pod="openshift-marketplace/redhat-marketplace-wg6l9" Oct 01 11:07:06 crc kubenswrapper[4735]: I1001 11:07:06.822224 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kftjd\" (UniqueName: \"kubernetes.io/projected/7cbf4311-6ba1-4806-8d10-aa4e0d34039f-kube-api-access-kftjd\") pod \"redhat-marketplace-wg6l9\" (UID: \"7cbf4311-6ba1-4806-8d10-aa4e0d34039f\") " pod="openshift-marketplace/redhat-marketplace-wg6l9" Oct 01 11:07:06 crc kubenswrapper[4735]: I1001 11:07:06.925785 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kftjd\" (UniqueName: \"kubernetes.io/projected/7cbf4311-6ba1-4806-8d10-aa4e0d34039f-kube-api-access-kftjd\") pod \"redhat-marketplace-wg6l9\" (UID: \"7cbf4311-6ba1-4806-8d10-aa4e0d34039f\") " pod="openshift-marketplace/redhat-marketplace-wg6l9" Oct 01 11:07:06 crc kubenswrapper[4735]: I1001 11:07:06.926170 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cbf4311-6ba1-4806-8d10-aa4e0d34039f-utilities\") pod \"redhat-marketplace-wg6l9\" (UID: \"7cbf4311-6ba1-4806-8d10-aa4e0d34039f\") " pod="openshift-marketplace/redhat-marketplace-wg6l9" Oct 01 11:07:06 crc kubenswrapper[4735]: I1001 11:07:06.926243 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cbf4311-6ba1-4806-8d10-aa4e0d34039f-catalog-content\") pod \"redhat-marketplace-wg6l9\" (UID: \"7cbf4311-6ba1-4806-8d10-aa4e0d34039f\") " pod="openshift-marketplace/redhat-marketplace-wg6l9" Oct 01 11:07:06 crc kubenswrapper[4735]: I1001 11:07:06.929134 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cbf4311-6ba1-4806-8d10-aa4e0d34039f-utilities\") pod \"redhat-marketplace-wg6l9\" (UID: \"7cbf4311-6ba1-4806-8d10-aa4e0d34039f\") " pod="openshift-marketplace/redhat-marketplace-wg6l9" Oct 01 11:07:06 crc kubenswrapper[4735]: I1001 11:07:06.929178 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cbf4311-6ba1-4806-8d10-aa4e0d34039f-catalog-content\") pod \"redhat-marketplace-wg6l9\" (UID: \"7cbf4311-6ba1-4806-8d10-aa4e0d34039f\") " pod="openshift-marketplace/redhat-marketplace-wg6l9" Oct 01 11:07:06 crc kubenswrapper[4735]: I1001 11:07:06.952518 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kftjd\" (UniqueName: \"kubernetes.io/projected/7cbf4311-6ba1-4806-8d10-aa4e0d34039f-kube-api-access-kftjd\") pod \"redhat-marketplace-wg6l9\" (UID: \"7cbf4311-6ba1-4806-8d10-aa4e0d34039f\") " pod="openshift-marketplace/redhat-marketplace-wg6l9" Oct 01 11:07:07 crc kubenswrapper[4735]: I1001 11:07:07.057338 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wg6l9" Oct 01 11:07:07 crc kubenswrapper[4735]: I1001 11:07:07.494749 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wg6l9"] Oct 01 11:07:08 crc kubenswrapper[4735]: I1001 11:07:08.069237 4735 generic.go:334] "Generic (PLEG): container finished" podID="7cbf4311-6ba1-4806-8d10-aa4e0d34039f" containerID="0f84c55118d54d6c47603233a4cf30cbf0b1244a1933a50756a4a1ae36080ddb" exitCode=0 Oct 01 11:07:08 crc kubenswrapper[4735]: I1001 11:07:08.069311 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wg6l9" event={"ID":"7cbf4311-6ba1-4806-8d10-aa4e0d34039f","Type":"ContainerDied","Data":"0f84c55118d54d6c47603233a4cf30cbf0b1244a1933a50756a4a1ae36080ddb"} Oct 01 11:07:08 crc kubenswrapper[4735]: I1001 11:07:08.069600 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wg6l9" event={"ID":"7cbf4311-6ba1-4806-8d10-aa4e0d34039f","Type":"ContainerStarted","Data":"9475c07c6b288beedddf7263b92c6c5fb64df0c184f8e9edd6152b6fd0da8276"} Oct 01 11:07:08 crc kubenswrapper[4735]: I1001 11:07:08.071618 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 11:07:09 crc kubenswrapper[4735]: I1001 11:07:09.079543 4735 generic.go:334] "Generic (PLEG): container finished" podID="7cbf4311-6ba1-4806-8d10-aa4e0d34039f" containerID="5154d351b28baf17dfc4fc52b22b80365f32b8459e4e64033c8e738a2cb7d045" exitCode=0 Oct 01 11:07:09 crc kubenswrapper[4735]: I1001 11:07:09.079665 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wg6l9" event={"ID":"7cbf4311-6ba1-4806-8d10-aa4e0d34039f","Type":"ContainerDied","Data":"5154d351b28baf17dfc4fc52b22b80365f32b8459e4e64033c8e738a2cb7d045"} Oct 01 11:07:10 crc kubenswrapper[4735]: I1001 11:07:10.091287 4735 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-wg6l9" event={"ID":"7cbf4311-6ba1-4806-8d10-aa4e0d34039f","Type":"ContainerStarted","Data":"6298f069b9de6350201de87e3d6fb974adfedc76954cd0984e0d74b6001e94a2"} Oct 01 11:07:10 crc kubenswrapper[4735]: I1001 11:07:10.110567 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wg6l9" podStartSLOduration=2.634584287 podStartE2EDuration="4.110546998s" podCreationTimestamp="2025-10-01 11:07:06 +0000 UTC" firstStartedPulling="2025-10-01 11:07:08.071186578 +0000 UTC m=+2986.764007850" lastFinishedPulling="2025-10-01 11:07:09.547149289 +0000 UTC m=+2988.239970561" observedRunningTime="2025-10-01 11:07:10.106695074 +0000 UTC m=+2988.799516336" watchObservedRunningTime="2025-10-01 11:07:10.110546998 +0000 UTC m=+2988.803368270" Oct 01 11:07:12 crc kubenswrapper[4735]: I1001 11:07:12.897365 4735 scope.go:117] "RemoveContainer" containerID="a4914eda549785b6e3091537df03e9f39ab7844172c26d202c4f0e5020c25e24" Oct 01 11:07:12 crc kubenswrapper[4735]: E1001 11:07:12.898187 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:07:17 crc kubenswrapper[4735]: I1001 11:07:17.058350 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wg6l9" Oct 01 11:07:17 crc kubenswrapper[4735]: I1001 11:07:17.058627 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wg6l9" Oct 01 11:07:17 crc kubenswrapper[4735]: I1001 11:07:17.110251 4735 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wg6l9" Oct 01 11:07:17 crc kubenswrapper[4735]: I1001 11:07:17.234980 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wg6l9" Oct 01 11:07:17 crc kubenswrapper[4735]: I1001 11:07:17.350618 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wg6l9"] Oct 01 11:07:19 crc kubenswrapper[4735]: I1001 11:07:19.183292 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wg6l9" podUID="7cbf4311-6ba1-4806-8d10-aa4e0d34039f" containerName="registry-server" containerID="cri-o://6298f069b9de6350201de87e3d6fb974adfedc76954cd0984e0d74b6001e94a2" gracePeriod=2 Oct 01 11:07:19 crc kubenswrapper[4735]: I1001 11:07:19.727440 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wg6l9" Oct 01 11:07:19 crc kubenswrapper[4735]: I1001 11:07:19.907450 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cbf4311-6ba1-4806-8d10-aa4e0d34039f-utilities\") pod \"7cbf4311-6ba1-4806-8d10-aa4e0d34039f\" (UID: \"7cbf4311-6ba1-4806-8d10-aa4e0d34039f\") " Oct 01 11:07:19 crc kubenswrapper[4735]: I1001 11:07:19.907552 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cbf4311-6ba1-4806-8d10-aa4e0d34039f-catalog-content\") pod \"7cbf4311-6ba1-4806-8d10-aa4e0d34039f\" (UID: \"7cbf4311-6ba1-4806-8d10-aa4e0d34039f\") " Oct 01 11:07:19 crc kubenswrapper[4735]: I1001 11:07:19.907643 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kftjd\" (UniqueName: \"kubernetes.io/projected/7cbf4311-6ba1-4806-8d10-aa4e0d34039f-kube-api-access-kftjd\") 
pod \"7cbf4311-6ba1-4806-8d10-aa4e0d34039f\" (UID: \"7cbf4311-6ba1-4806-8d10-aa4e0d34039f\") " Oct 01 11:07:19 crc kubenswrapper[4735]: I1001 11:07:19.908612 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cbf4311-6ba1-4806-8d10-aa4e0d34039f-utilities" (OuterVolumeSpecName: "utilities") pod "7cbf4311-6ba1-4806-8d10-aa4e0d34039f" (UID: "7cbf4311-6ba1-4806-8d10-aa4e0d34039f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:07:19 crc kubenswrapper[4735]: I1001 11:07:19.913521 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cbf4311-6ba1-4806-8d10-aa4e0d34039f-kube-api-access-kftjd" (OuterVolumeSpecName: "kube-api-access-kftjd") pod "7cbf4311-6ba1-4806-8d10-aa4e0d34039f" (UID: "7cbf4311-6ba1-4806-8d10-aa4e0d34039f"). InnerVolumeSpecName "kube-api-access-kftjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:07:19 crc kubenswrapper[4735]: I1001 11:07:19.926964 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cbf4311-6ba1-4806-8d10-aa4e0d34039f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7cbf4311-6ba1-4806-8d10-aa4e0d34039f" (UID: "7cbf4311-6ba1-4806-8d10-aa4e0d34039f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:07:20 crc kubenswrapper[4735]: I1001 11:07:20.010528 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cbf4311-6ba1-4806-8d10-aa4e0d34039f-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 11:07:20 crc kubenswrapper[4735]: I1001 11:07:20.011460 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cbf4311-6ba1-4806-8d10-aa4e0d34039f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 11:07:20 crc kubenswrapper[4735]: I1001 11:07:20.011538 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kftjd\" (UniqueName: \"kubernetes.io/projected/7cbf4311-6ba1-4806-8d10-aa4e0d34039f-kube-api-access-kftjd\") on node \"crc\" DevicePath \"\"" Oct 01 11:07:20 crc kubenswrapper[4735]: I1001 11:07:20.194087 4735 generic.go:334] "Generic (PLEG): container finished" podID="7cbf4311-6ba1-4806-8d10-aa4e0d34039f" containerID="6298f069b9de6350201de87e3d6fb974adfedc76954cd0984e0d74b6001e94a2" exitCode=0 Oct 01 11:07:20 crc kubenswrapper[4735]: I1001 11:07:20.194125 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wg6l9" event={"ID":"7cbf4311-6ba1-4806-8d10-aa4e0d34039f","Type":"ContainerDied","Data":"6298f069b9de6350201de87e3d6fb974adfedc76954cd0984e0d74b6001e94a2"} Oct 01 11:07:20 crc kubenswrapper[4735]: I1001 11:07:20.194153 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wg6l9" event={"ID":"7cbf4311-6ba1-4806-8d10-aa4e0d34039f","Type":"ContainerDied","Data":"9475c07c6b288beedddf7263b92c6c5fb64df0c184f8e9edd6152b6fd0da8276"} Oct 01 11:07:20 crc kubenswrapper[4735]: I1001 11:07:20.194171 4735 scope.go:117] "RemoveContainer" containerID="6298f069b9de6350201de87e3d6fb974adfedc76954cd0984e0d74b6001e94a2" Oct 01 11:07:20 crc kubenswrapper[4735]: I1001 
11:07:20.194290 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wg6l9" Oct 01 11:07:20 crc kubenswrapper[4735]: I1001 11:07:20.227360 4735 scope.go:117] "RemoveContainer" containerID="5154d351b28baf17dfc4fc52b22b80365f32b8459e4e64033c8e738a2cb7d045" Oct 01 11:07:20 crc kubenswrapper[4735]: I1001 11:07:20.236722 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wg6l9"] Oct 01 11:07:20 crc kubenswrapper[4735]: I1001 11:07:20.243930 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wg6l9"] Oct 01 11:07:20 crc kubenswrapper[4735]: I1001 11:07:20.266667 4735 scope.go:117] "RemoveContainer" containerID="0f84c55118d54d6c47603233a4cf30cbf0b1244a1933a50756a4a1ae36080ddb" Oct 01 11:07:20 crc kubenswrapper[4735]: I1001 11:07:20.297552 4735 scope.go:117] "RemoveContainer" containerID="6298f069b9de6350201de87e3d6fb974adfedc76954cd0984e0d74b6001e94a2" Oct 01 11:07:20 crc kubenswrapper[4735]: E1001 11:07:20.298047 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6298f069b9de6350201de87e3d6fb974adfedc76954cd0984e0d74b6001e94a2\": container with ID starting with 6298f069b9de6350201de87e3d6fb974adfedc76954cd0984e0d74b6001e94a2 not found: ID does not exist" containerID="6298f069b9de6350201de87e3d6fb974adfedc76954cd0984e0d74b6001e94a2" Oct 01 11:07:20 crc kubenswrapper[4735]: I1001 11:07:20.298078 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6298f069b9de6350201de87e3d6fb974adfedc76954cd0984e0d74b6001e94a2"} err="failed to get container status \"6298f069b9de6350201de87e3d6fb974adfedc76954cd0984e0d74b6001e94a2\": rpc error: code = NotFound desc = could not find container \"6298f069b9de6350201de87e3d6fb974adfedc76954cd0984e0d74b6001e94a2\": container with ID starting with 
6298f069b9de6350201de87e3d6fb974adfedc76954cd0984e0d74b6001e94a2 not found: ID does not exist" Oct 01 11:07:20 crc kubenswrapper[4735]: I1001 11:07:20.298096 4735 scope.go:117] "RemoveContainer" containerID="5154d351b28baf17dfc4fc52b22b80365f32b8459e4e64033c8e738a2cb7d045" Oct 01 11:07:20 crc kubenswrapper[4735]: E1001 11:07:20.298487 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5154d351b28baf17dfc4fc52b22b80365f32b8459e4e64033c8e738a2cb7d045\": container with ID starting with 5154d351b28baf17dfc4fc52b22b80365f32b8459e4e64033c8e738a2cb7d045 not found: ID does not exist" containerID="5154d351b28baf17dfc4fc52b22b80365f32b8459e4e64033c8e738a2cb7d045" Oct 01 11:07:20 crc kubenswrapper[4735]: I1001 11:07:20.298666 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5154d351b28baf17dfc4fc52b22b80365f32b8459e4e64033c8e738a2cb7d045"} err="failed to get container status \"5154d351b28baf17dfc4fc52b22b80365f32b8459e4e64033c8e738a2cb7d045\": rpc error: code = NotFound desc = could not find container \"5154d351b28baf17dfc4fc52b22b80365f32b8459e4e64033c8e738a2cb7d045\": container with ID starting with 5154d351b28baf17dfc4fc52b22b80365f32b8459e4e64033c8e738a2cb7d045 not found: ID does not exist" Oct 01 11:07:20 crc kubenswrapper[4735]: I1001 11:07:20.298726 4735 scope.go:117] "RemoveContainer" containerID="0f84c55118d54d6c47603233a4cf30cbf0b1244a1933a50756a4a1ae36080ddb" Oct 01 11:07:20 crc kubenswrapper[4735]: E1001 11:07:20.299156 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f84c55118d54d6c47603233a4cf30cbf0b1244a1933a50756a4a1ae36080ddb\": container with ID starting with 0f84c55118d54d6c47603233a4cf30cbf0b1244a1933a50756a4a1ae36080ddb not found: ID does not exist" containerID="0f84c55118d54d6c47603233a4cf30cbf0b1244a1933a50756a4a1ae36080ddb" Oct 01 11:07:20 crc 
kubenswrapper[4735]: I1001 11:07:20.299220 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f84c55118d54d6c47603233a4cf30cbf0b1244a1933a50756a4a1ae36080ddb"} err="failed to get container status \"0f84c55118d54d6c47603233a4cf30cbf0b1244a1933a50756a4a1ae36080ddb\": rpc error: code = NotFound desc = could not find container \"0f84c55118d54d6c47603233a4cf30cbf0b1244a1933a50756a4a1ae36080ddb\": container with ID starting with 0f84c55118d54d6c47603233a4cf30cbf0b1244a1933a50756a4a1ae36080ddb not found: ID does not exist" Oct 01 11:07:21 crc kubenswrapper[4735]: I1001 11:07:21.920096 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cbf4311-6ba1-4806-8d10-aa4e0d34039f" path="/var/lib/kubelet/pods/7cbf4311-6ba1-4806-8d10-aa4e0d34039f/volumes" Oct 01 11:07:26 crc kubenswrapper[4735]: I1001 11:07:26.897896 4735 scope.go:117] "RemoveContainer" containerID="a4914eda549785b6e3091537df03e9f39ab7844172c26d202c4f0e5020c25e24" Oct 01 11:07:26 crc kubenswrapper[4735]: E1001 11:07:26.899468 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:07:38 crc kubenswrapper[4735]: I1001 11:07:38.897978 4735 scope.go:117] "RemoveContainer" containerID="a4914eda549785b6e3091537df03e9f39ab7844172c26d202c4f0e5020c25e24" Oct 01 11:07:38 crc kubenswrapper[4735]: E1001 11:07:38.901088 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:07:51 crc kubenswrapper[4735]: I1001 11:07:51.905381 4735 scope.go:117] "RemoveContainer" containerID="a4914eda549785b6e3091537df03e9f39ab7844172c26d202c4f0e5020c25e24" Oct 01 11:07:51 crc kubenswrapper[4735]: E1001 11:07:51.906352 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:08:02 crc kubenswrapper[4735]: I1001 11:08:02.896700 4735 scope.go:117] "RemoveContainer" containerID="a4914eda549785b6e3091537df03e9f39ab7844172c26d202c4f0e5020c25e24" Oct 01 11:08:02 crc kubenswrapper[4735]: E1001 11:08:02.897621 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:08:15 crc kubenswrapper[4735]: I1001 11:08:15.899186 4735 scope.go:117] "RemoveContainer" containerID="a4914eda549785b6e3091537df03e9f39ab7844172c26d202c4f0e5020c25e24" Oct 01 11:08:15 crc kubenswrapper[4735]: E1001 11:08:15.900726 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:08:29 crc kubenswrapper[4735]: I1001 11:08:29.897176 4735 scope.go:117] "RemoveContainer" containerID="a4914eda549785b6e3091537df03e9f39ab7844172c26d202c4f0e5020c25e24" Oct 01 11:08:29 crc kubenswrapper[4735]: E1001 11:08:29.897758 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:08:43 crc kubenswrapper[4735]: I1001 11:08:43.897781 4735 scope.go:117] "RemoveContainer" containerID="a4914eda549785b6e3091537df03e9f39ab7844172c26d202c4f0e5020c25e24" Oct 01 11:08:43 crc kubenswrapper[4735]: E1001 11:08:43.899654 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:08:57 crc kubenswrapper[4735]: I1001 11:08:57.897973 4735 scope.go:117] "RemoveContainer" containerID="a4914eda549785b6e3091537df03e9f39ab7844172c26d202c4f0e5020c25e24" Oct 01 11:08:57 crc kubenswrapper[4735]: E1001 11:08:57.898991 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:09:11 crc kubenswrapper[4735]: I1001 11:09:11.910442 4735 scope.go:117] "RemoveContainer" containerID="a4914eda549785b6e3091537df03e9f39ab7844172c26d202c4f0e5020c25e24" Oct 01 11:09:11 crc kubenswrapper[4735]: E1001 11:09:11.911314 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:09:22 crc kubenswrapper[4735]: I1001 11:09:22.897155 4735 scope.go:117] "RemoveContainer" containerID="a4914eda549785b6e3091537df03e9f39ab7844172c26d202c4f0e5020c25e24" Oct 01 11:09:22 crc kubenswrapper[4735]: E1001 11:09:22.897999 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:09:33 crc kubenswrapper[4735]: I1001 11:09:33.897924 4735 scope.go:117] "RemoveContainer" containerID="a4914eda549785b6e3091537df03e9f39ab7844172c26d202c4f0e5020c25e24" Oct 01 11:09:33 crc kubenswrapper[4735]: E1001 11:09:33.898718 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:09:48 crc kubenswrapper[4735]: I1001 11:09:48.897711 4735 scope.go:117] "RemoveContainer" containerID="a4914eda549785b6e3091537df03e9f39ab7844172c26d202c4f0e5020c25e24" Oct 01 11:09:48 crc kubenswrapper[4735]: E1001 11:09:48.898868 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:10:00 crc kubenswrapper[4735]: I1001 11:10:00.897603 4735 scope.go:117] "RemoveContainer" containerID="a4914eda549785b6e3091537df03e9f39ab7844172c26d202c4f0e5020c25e24" Oct 01 11:10:00 crc kubenswrapper[4735]: E1001 11:10:00.898537 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:10:15 crc kubenswrapper[4735]: I1001 11:10:15.898290 4735 scope.go:117] "RemoveContainer" containerID="a4914eda549785b6e3091537df03e9f39ab7844172c26d202c4f0e5020c25e24" Oct 01 11:10:15 crc kubenswrapper[4735]: E1001 11:10:15.899398 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:10:29 crc kubenswrapper[4735]: I1001 11:10:29.897638 4735 scope.go:117] "RemoveContainer" containerID="a4914eda549785b6e3091537df03e9f39ab7844172c26d202c4f0e5020c25e24" Oct 01 11:10:29 crc kubenswrapper[4735]: E1001 11:10:29.898528 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:10:43 crc kubenswrapper[4735]: I1001 11:10:43.898314 4735 scope.go:117] "RemoveContainer" containerID="a4914eda549785b6e3091537df03e9f39ab7844172c26d202c4f0e5020c25e24" Oct 01 11:10:43 crc kubenswrapper[4735]: E1001 11:10:43.899073 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:10:58 crc kubenswrapper[4735]: I1001 11:10:58.897854 4735 scope.go:117] "RemoveContainer" containerID="a4914eda549785b6e3091537df03e9f39ab7844172c26d202c4f0e5020c25e24" Oct 01 11:10:58 crc kubenswrapper[4735]: E1001 11:10:58.898630 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:11:09 crc kubenswrapper[4735]: I1001 11:11:09.897678 4735 scope.go:117] "RemoveContainer" containerID="a4914eda549785b6e3091537df03e9f39ab7844172c26d202c4f0e5020c25e24" Oct 01 11:11:09 crc kubenswrapper[4735]: E1001 11:11:09.898593 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:11:22 crc kubenswrapper[4735]: I1001 11:11:22.897736 4735 scope.go:117] "RemoveContainer" containerID="a4914eda549785b6e3091537df03e9f39ab7844172c26d202c4f0e5020c25e24" Oct 01 11:11:22 crc kubenswrapper[4735]: E1001 11:11:22.898982 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:11:36 crc kubenswrapper[4735]: I1001 11:11:36.897077 4735 scope.go:117] "RemoveContainer" containerID="a4914eda549785b6e3091537df03e9f39ab7844172c26d202c4f0e5020c25e24" Oct 01 11:11:37 crc kubenswrapper[4735]: I1001 11:11:37.891043 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-xgg24" event={"ID":"8c2fdbf0-2469-4ca0-8624-d63609123cd1","Type":"ContainerStarted","Data":"7c0fff2ad8c11c90072972b08d9ebb58c252453d2fef2077d87db58eb0f716d0"} Oct 01 11:11:38 crc kubenswrapper[4735]: I1001 11:11:38.017568 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pvqbj"] Oct 01 11:11:38 crc kubenswrapper[4735]: E1001 11:11:38.018366 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cbf4311-6ba1-4806-8d10-aa4e0d34039f" containerName="extract-content" Oct 01 11:11:38 crc kubenswrapper[4735]: I1001 11:11:38.018382 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cbf4311-6ba1-4806-8d10-aa4e0d34039f" containerName="extract-content" Oct 01 11:11:38 crc kubenswrapper[4735]: E1001 11:11:38.018393 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cbf4311-6ba1-4806-8d10-aa4e0d34039f" containerName="extract-utilities" Oct 01 11:11:38 crc kubenswrapper[4735]: I1001 11:11:38.018400 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cbf4311-6ba1-4806-8d10-aa4e0d34039f" containerName="extract-utilities" Oct 01 11:11:38 crc kubenswrapper[4735]: E1001 11:11:38.018426 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cbf4311-6ba1-4806-8d10-aa4e0d34039f" containerName="registry-server" Oct 01 11:11:38 crc kubenswrapper[4735]: I1001 11:11:38.018433 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cbf4311-6ba1-4806-8d10-aa4e0d34039f" containerName="registry-server" Oct 01 11:11:38 crc kubenswrapper[4735]: I1001 11:11:38.018685 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cbf4311-6ba1-4806-8d10-aa4e0d34039f" containerName="registry-server" Oct 01 11:11:38 crc kubenswrapper[4735]: I1001 11:11:38.020167 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pvqbj" Oct 01 11:11:38 crc kubenswrapper[4735]: I1001 11:11:38.030096 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pvqbj"] Oct 01 11:11:38 crc kubenswrapper[4735]: I1001 11:11:38.133342 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edcea167-4fae-4818-9440-19bf5a19261d-catalog-content\") pod \"community-operators-pvqbj\" (UID: \"edcea167-4fae-4818-9440-19bf5a19261d\") " pod="openshift-marketplace/community-operators-pvqbj" Oct 01 11:11:38 crc kubenswrapper[4735]: I1001 11:11:38.133727 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ndl9\" (UniqueName: \"kubernetes.io/projected/edcea167-4fae-4818-9440-19bf5a19261d-kube-api-access-4ndl9\") pod \"community-operators-pvqbj\" (UID: \"edcea167-4fae-4818-9440-19bf5a19261d\") " pod="openshift-marketplace/community-operators-pvqbj" Oct 01 11:11:38 crc kubenswrapper[4735]: I1001 11:11:38.134083 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edcea167-4fae-4818-9440-19bf5a19261d-utilities\") pod \"community-operators-pvqbj\" (UID: \"edcea167-4fae-4818-9440-19bf5a19261d\") " pod="openshift-marketplace/community-operators-pvqbj" Oct 01 11:11:38 crc kubenswrapper[4735]: I1001 11:11:38.236540 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edcea167-4fae-4818-9440-19bf5a19261d-utilities\") pod \"community-operators-pvqbj\" (UID: \"edcea167-4fae-4818-9440-19bf5a19261d\") " pod="openshift-marketplace/community-operators-pvqbj" Oct 01 11:11:38 crc kubenswrapper[4735]: I1001 11:11:38.236691 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edcea167-4fae-4818-9440-19bf5a19261d-catalog-content\") pod \"community-operators-pvqbj\" (UID: \"edcea167-4fae-4818-9440-19bf5a19261d\") " pod="openshift-marketplace/community-operators-pvqbj" Oct 01 11:11:38 crc kubenswrapper[4735]: I1001 11:11:38.236769 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ndl9\" (UniqueName: \"kubernetes.io/projected/edcea167-4fae-4818-9440-19bf5a19261d-kube-api-access-4ndl9\") pod \"community-operators-pvqbj\" (UID: \"edcea167-4fae-4818-9440-19bf5a19261d\") " pod="openshift-marketplace/community-operators-pvqbj" Oct 01 11:11:38 crc kubenswrapper[4735]: I1001 11:11:38.237336 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edcea167-4fae-4818-9440-19bf5a19261d-utilities\") pod \"community-operators-pvqbj\" (UID: \"edcea167-4fae-4818-9440-19bf5a19261d\") " pod="openshift-marketplace/community-operators-pvqbj" Oct 01 11:11:38 crc kubenswrapper[4735]: I1001 11:11:38.237701 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edcea167-4fae-4818-9440-19bf5a19261d-catalog-content\") pod \"community-operators-pvqbj\" (UID: \"edcea167-4fae-4818-9440-19bf5a19261d\") " pod="openshift-marketplace/community-operators-pvqbj" Oct 01 11:11:38 crc kubenswrapper[4735]: I1001 11:11:38.257534 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ndl9\" (UniqueName: \"kubernetes.io/projected/edcea167-4fae-4818-9440-19bf5a19261d-kube-api-access-4ndl9\") pod \"community-operators-pvqbj\" (UID: \"edcea167-4fae-4818-9440-19bf5a19261d\") " pod="openshift-marketplace/community-operators-pvqbj" Oct 01 11:11:38 crc kubenswrapper[4735]: I1001 11:11:38.339819 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pvqbj" Oct 01 11:11:38 crc kubenswrapper[4735]: I1001 11:11:38.847936 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pvqbj"] Oct 01 11:11:38 crc kubenswrapper[4735]: I1001 11:11:38.904859 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pvqbj" event={"ID":"edcea167-4fae-4818-9440-19bf5a19261d","Type":"ContainerStarted","Data":"8fd2b4ed52fa3d09ae68fb8be1c55e21a2340ca2768d9d25f4b9044bcb53cc1c"} Oct 01 11:11:39 crc kubenswrapper[4735]: I1001 11:11:39.922679 4735 generic.go:334] "Generic (PLEG): container finished" podID="edcea167-4fae-4818-9440-19bf5a19261d" containerID="e36c65dcf26f87b5cc2a5fb004f1ab620004d5cd856494e593d31738ede4d541" exitCode=0 Oct 01 11:11:39 crc kubenswrapper[4735]: I1001 11:11:39.923071 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pvqbj" event={"ID":"edcea167-4fae-4818-9440-19bf5a19261d","Type":"ContainerDied","Data":"e36c65dcf26f87b5cc2a5fb004f1ab620004d5cd856494e593d31738ede4d541"} Oct 01 11:11:41 crc kubenswrapper[4735]: I1001 11:11:41.946859 4735 generic.go:334] "Generic (PLEG): container finished" podID="edcea167-4fae-4818-9440-19bf5a19261d" containerID="a8e701dd703911f3a1d05635eaf25444e98ed301f78508e7d5afd79767c63012" exitCode=0 Oct 01 11:11:41 crc kubenswrapper[4735]: I1001 11:11:41.947196 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pvqbj" event={"ID":"edcea167-4fae-4818-9440-19bf5a19261d","Type":"ContainerDied","Data":"a8e701dd703911f3a1d05635eaf25444e98ed301f78508e7d5afd79767c63012"} Oct 01 11:11:43 crc kubenswrapper[4735]: I1001 11:11:43.968383 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pvqbj" 
event={"ID":"edcea167-4fae-4818-9440-19bf5a19261d","Type":"ContainerStarted","Data":"27a954e13c2d6fd1c5c566ab23946c5db7ccfacf02a441d3a26092a792fd041b"} Oct 01 11:11:43 crc kubenswrapper[4735]: I1001 11:11:43.993103 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pvqbj" podStartSLOduration=3.231016162 podStartE2EDuration="6.993085381s" podCreationTimestamp="2025-10-01 11:11:37 +0000 UTC" firstStartedPulling="2025-10-01 11:11:39.927570005 +0000 UTC m=+3258.620391277" lastFinishedPulling="2025-10-01 11:11:43.689639194 +0000 UTC m=+3262.382460496" observedRunningTime="2025-10-01 11:11:43.990960224 +0000 UTC m=+3262.683781496" watchObservedRunningTime="2025-10-01 11:11:43.993085381 +0000 UTC m=+3262.685906643" Oct 01 11:11:48 crc kubenswrapper[4735]: I1001 11:11:48.340844 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pvqbj" Oct 01 11:11:48 crc kubenswrapper[4735]: I1001 11:11:48.341203 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pvqbj" Oct 01 11:11:48 crc kubenswrapper[4735]: I1001 11:11:48.417033 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pvqbj" Oct 01 11:11:49 crc kubenswrapper[4735]: I1001 11:11:49.089747 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pvqbj" Oct 01 11:11:49 crc kubenswrapper[4735]: I1001 11:11:49.140205 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pvqbj"] Oct 01 11:11:51 crc kubenswrapper[4735]: I1001 11:11:51.042997 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pvqbj" podUID="edcea167-4fae-4818-9440-19bf5a19261d" containerName="registry-server" 
containerID="cri-o://27a954e13c2d6fd1c5c566ab23946c5db7ccfacf02a441d3a26092a792fd041b" gracePeriod=2 Oct 01 11:11:51 crc kubenswrapper[4735]: I1001 11:11:51.649208 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pvqbj" Oct 01 11:11:51 crc kubenswrapper[4735]: I1001 11:11:51.720747 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edcea167-4fae-4818-9440-19bf5a19261d-catalog-content\") pod \"edcea167-4fae-4818-9440-19bf5a19261d\" (UID: \"edcea167-4fae-4818-9440-19bf5a19261d\") " Oct 01 11:11:51 crc kubenswrapper[4735]: I1001 11:11:51.720851 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ndl9\" (UniqueName: \"kubernetes.io/projected/edcea167-4fae-4818-9440-19bf5a19261d-kube-api-access-4ndl9\") pod \"edcea167-4fae-4818-9440-19bf5a19261d\" (UID: \"edcea167-4fae-4818-9440-19bf5a19261d\") " Oct 01 11:11:51 crc kubenswrapper[4735]: I1001 11:11:51.720941 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edcea167-4fae-4818-9440-19bf5a19261d-utilities\") pod \"edcea167-4fae-4818-9440-19bf5a19261d\" (UID: \"edcea167-4fae-4818-9440-19bf5a19261d\") " Oct 01 11:11:51 crc kubenswrapper[4735]: I1001 11:11:51.722700 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edcea167-4fae-4818-9440-19bf5a19261d-utilities" (OuterVolumeSpecName: "utilities") pod "edcea167-4fae-4818-9440-19bf5a19261d" (UID: "edcea167-4fae-4818-9440-19bf5a19261d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:11:51 crc kubenswrapper[4735]: I1001 11:11:51.728970 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edcea167-4fae-4818-9440-19bf5a19261d-kube-api-access-4ndl9" (OuterVolumeSpecName: "kube-api-access-4ndl9") pod "edcea167-4fae-4818-9440-19bf5a19261d" (UID: "edcea167-4fae-4818-9440-19bf5a19261d"). InnerVolumeSpecName "kube-api-access-4ndl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:11:51 crc kubenswrapper[4735]: I1001 11:11:51.823784 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ndl9\" (UniqueName: \"kubernetes.io/projected/edcea167-4fae-4818-9440-19bf5a19261d-kube-api-access-4ndl9\") on node \"crc\" DevicePath \"\"" Oct 01 11:11:51 crc kubenswrapper[4735]: I1001 11:11:51.823827 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edcea167-4fae-4818-9440-19bf5a19261d-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 11:11:52 crc kubenswrapper[4735]: I1001 11:11:52.057273 4735 generic.go:334] "Generic (PLEG): container finished" podID="edcea167-4fae-4818-9440-19bf5a19261d" containerID="27a954e13c2d6fd1c5c566ab23946c5db7ccfacf02a441d3a26092a792fd041b" exitCode=0 Oct 01 11:11:52 crc kubenswrapper[4735]: I1001 11:11:52.057853 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pvqbj" Oct 01 11:11:52 crc kubenswrapper[4735]: I1001 11:11:52.057821 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pvqbj" event={"ID":"edcea167-4fae-4818-9440-19bf5a19261d","Type":"ContainerDied","Data":"27a954e13c2d6fd1c5c566ab23946c5db7ccfacf02a441d3a26092a792fd041b"} Oct 01 11:11:52 crc kubenswrapper[4735]: I1001 11:11:52.058945 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pvqbj" event={"ID":"edcea167-4fae-4818-9440-19bf5a19261d","Type":"ContainerDied","Data":"8fd2b4ed52fa3d09ae68fb8be1c55e21a2340ca2768d9d25f4b9044bcb53cc1c"} Oct 01 11:11:52 crc kubenswrapper[4735]: I1001 11:11:52.058984 4735 scope.go:117] "RemoveContainer" containerID="27a954e13c2d6fd1c5c566ab23946c5db7ccfacf02a441d3a26092a792fd041b" Oct 01 11:11:52 crc kubenswrapper[4735]: I1001 11:11:52.097932 4735 scope.go:117] "RemoveContainer" containerID="a8e701dd703911f3a1d05635eaf25444e98ed301f78508e7d5afd79767c63012" Oct 01 11:11:52 crc kubenswrapper[4735]: I1001 11:11:52.136633 4735 scope.go:117] "RemoveContainer" containerID="e36c65dcf26f87b5cc2a5fb004f1ab620004d5cd856494e593d31738ede4d541" Oct 01 11:11:52 crc kubenswrapper[4735]: I1001 11:11:52.204304 4735 scope.go:117] "RemoveContainer" containerID="27a954e13c2d6fd1c5c566ab23946c5db7ccfacf02a441d3a26092a792fd041b" Oct 01 11:11:52 crc kubenswrapper[4735]: E1001 11:11:52.204906 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27a954e13c2d6fd1c5c566ab23946c5db7ccfacf02a441d3a26092a792fd041b\": container with ID starting with 27a954e13c2d6fd1c5c566ab23946c5db7ccfacf02a441d3a26092a792fd041b not found: ID does not exist" containerID="27a954e13c2d6fd1c5c566ab23946c5db7ccfacf02a441d3a26092a792fd041b" Oct 01 11:11:52 crc kubenswrapper[4735]: I1001 11:11:52.204945 4735 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27a954e13c2d6fd1c5c566ab23946c5db7ccfacf02a441d3a26092a792fd041b"} err="failed to get container status \"27a954e13c2d6fd1c5c566ab23946c5db7ccfacf02a441d3a26092a792fd041b\": rpc error: code = NotFound desc = could not find container \"27a954e13c2d6fd1c5c566ab23946c5db7ccfacf02a441d3a26092a792fd041b\": container with ID starting with 27a954e13c2d6fd1c5c566ab23946c5db7ccfacf02a441d3a26092a792fd041b not found: ID does not exist" Oct 01 11:11:52 crc kubenswrapper[4735]: I1001 11:11:52.204972 4735 scope.go:117] "RemoveContainer" containerID="a8e701dd703911f3a1d05635eaf25444e98ed301f78508e7d5afd79767c63012" Oct 01 11:11:52 crc kubenswrapper[4735]: E1001 11:11:52.205428 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8e701dd703911f3a1d05635eaf25444e98ed301f78508e7d5afd79767c63012\": container with ID starting with a8e701dd703911f3a1d05635eaf25444e98ed301f78508e7d5afd79767c63012 not found: ID does not exist" containerID="a8e701dd703911f3a1d05635eaf25444e98ed301f78508e7d5afd79767c63012" Oct 01 11:11:52 crc kubenswrapper[4735]: I1001 11:11:52.205484 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8e701dd703911f3a1d05635eaf25444e98ed301f78508e7d5afd79767c63012"} err="failed to get container status \"a8e701dd703911f3a1d05635eaf25444e98ed301f78508e7d5afd79767c63012\": rpc error: code = NotFound desc = could not find container \"a8e701dd703911f3a1d05635eaf25444e98ed301f78508e7d5afd79767c63012\": container with ID starting with a8e701dd703911f3a1d05635eaf25444e98ed301f78508e7d5afd79767c63012 not found: ID does not exist" Oct 01 11:11:52 crc kubenswrapper[4735]: I1001 11:11:52.205535 4735 scope.go:117] "RemoveContainer" containerID="e36c65dcf26f87b5cc2a5fb004f1ab620004d5cd856494e593d31738ede4d541" Oct 01 11:11:52 crc kubenswrapper[4735]: E1001 11:11:52.205886 4735 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e36c65dcf26f87b5cc2a5fb004f1ab620004d5cd856494e593d31738ede4d541\": container with ID starting with e36c65dcf26f87b5cc2a5fb004f1ab620004d5cd856494e593d31738ede4d541 not found: ID does not exist" containerID="e36c65dcf26f87b5cc2a5fb004f1ab620004d5cd856494e593d31738ede4d541" Oct 01 11:11:52 crc kubenswrapper[4735]: I1001 11:11:52.205920 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e36c65dcf26f87b5cc2a5fb004f1ab620004d5cd856494e593d31738ede4d541"} err="failed to get container status \"e36c65dcf26f87b5cc2a5fb004f1ab620004d5cd856494e593d31738ede4d541\": rpc error: code = NotFound desc = could not find container \"e36c65dcf26f87b5cc2a5fb004f1ab620004d5cd856494e593d31738ede4d541\": container with ID starting with e36c65dcf26f87b5cc2a5fb004f1ab620004d5cd856494e593d31738ede4d541 not found: ID does not exist" Oct 01 11:11:52 crc kubenswrapper[4735]: I1001 11:11:52.340145 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edcea167-4fae-4818-9440-19bf5a19261d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "edcea167-4fae-4818-9440-19bf5a19261d" (UID: "edcea167-4fae-4818-9440-19bf5a19261d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:11:52 crc kubenswrapper[4735]: I1001 11:11:52.341887 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edcea167-4fae-4818-9440-19bf5a19261d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 11:11:52 crc kubenswrapper[4735]: I1001 11:11:52.414981 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pvqbj"] Oct 01 11:11:52 crc kubenswrapper[4735]: I1001 11:11:52.429006 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pvqbj"] Oct 01 11:11:53 crc kubenswrapper[4735]: I1001 11:11:53.939022 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edcea167-4fae-4818-9440-19bf5a19261d" path="/var/lib/kubelet/pods/edcea167-4fae-4818-9440-19bf5a19261d/volumes" Oct 01 11:12:23 crc kubenswrapper[4735]: I1001 11:12:23.595322 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cxd9s"] Oct 01 11:12:23 crc kubenswrapper[4735]: E1001 11:12:23.596452 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edcea167-4fae-4818-9440-19bf5a19261d" containerName="extract-utilities" Oct 01 11:12:23 crc kubenswrapper[4735]: I1001 11:12:23.596472 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="edcea167-4fae-4818-9440-19bf5a19261d" containerName="extract-utilities" Oct 01 11:12:23 crc kubenswrapper[4735]: E1001 11:12:23.596516 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edcea167-4fae-4818-9440-19bf5a19261d" containerName="extract-content" Oct 01 11:12:23 crc kubenswrapper[4735]: I1001 11:12:23.596527 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="edcea167-4fae-4818-9440-19bf5a19261d" containerName="extract-content" Oct 01 11:12:23 crc kubenswrapper[4735]: E1001 11:12:23.596551 4735 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="edcea167-4fae-4818-9440-19bf5a19261d" containerName="registry-server" Oct 01 11:12:23 crc kubenswrapper[4735]: I1001 11:12:23.596559 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="edcea167-4fae-4818-9440-19bf5a19261d" containerName="registry-server" Oct 01 11:12:23 crc kubenswrapper[4735]: I1001 11:12:23.596782 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="edcea167-4fae-4818-9440-19bf5a19261d" containerName="registry-server" Oct 01 11:12:23 crc kubenswrapper[4735]: I1001 11:12:23.598798 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cxd9s" Oct 01 11:12:23 crc kubenswrapper[4735]: I1001 11:12:23.609374 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cxd9s"] Oct 01 11:12:23 crc kubenswrapper[4735]: I1001 11:12:23.660339 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06232b3f-0241-45ed-9267-c41302d3ce83-catalog-content\") pod \"redhat-operators-cxd9s\" (UID: \"06232b3f-0241-45ed-9267-c41302d3ce83\") " pod="openshift-marketplace/redhat-operators-cxd9s" Oct 01 11:12:23 crc kubenswrapper[4735]: I1001 11:12:23.660489 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkl2c\" (UniqueName: \"kubernetes.io/projected/06232b3f-0241-45ed-9267-c41302d3ce83-kube-api-access-gkl2c\") pod \"redhat-operators-cxd9s\" (UID: \"06232b3f-0241-45ed-9267-c41302d3ce83\") " pod="openshift-marketplace/redhat-operators-cxd9s" Oct 01 11:12:23 crc kubenswrapper[4735]: I1001 11:12:23.660957 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06232b3f-0241-45ed-9267-c41302d3ce83-utilities\") pod \"redhat-operators-cxd9s\" (UID: 
\"06232b3f-0241-45ed-9267-c41302d3ce83\") " pod="openshift-marketplace/redhat-operators-cxd9s" Oct 01 11:12:23 crc kubenswrapper[4735]: I1001 11:12:23.763790 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06232b3f-0241-45ed-9267-c41302d3ce83-catalog-content\") pod \"redhat-operators-cxd9s\" (UID: \"06232b3f-0241-45ed-9267-c41302d3ce83\") " pod="openshift-marketplace/redhat-operators-cxd9s" Oct 01 11:12:23 crc kubenswrapper[4735]: I1001 11:12:23.763854 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkl2c\" (UniqueName: \"kubernetes.io/projected/06232b3f-0241-45ed-9267-c41302d3ce83-kube-api-access-gkl2c\") pod \"redhat-operators-cxd9s\" (UID: \"06232b3f-0241-45ed-9267-c41302d3ce83\") " pod="openshift-marketplace/redhat-operators-cxd9s" Oct 01 11:12:23 crc kubenswrapper[4735]: I1001 11:12:23.763888 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06232b3f-0241-45ed-9267-c41302d3ce83-utilities\") pod \"redhat-operators-cxd9s\" (UID: \"06232b3f-0241-45ed-9267-c41302d3ce83\") " pod="openshift-marketplace/redhat-operators-cxd9s" Oct 01 11:12:23 crc kubenswrapper[4735]: I1001 11:12:23.764410 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06232b3f-0241-45ed-9267-c41302d3ce83-catalog-content\") pod \"redhat-operators-cxd9s\" (UID: \"06232b3f-0241-45ed-9267-c41302d3ce83\") " pod="openshift-marketplace/redhat-operators-cxd9s" Oct 01 11:12:23 crc kubenswrapper[4735]: I1001 11:12:23.764788 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06232b3f-0241-45ed-9267-c41302d3ce83-utilities\") pod \"redhat-operators-cxd9s\" (UID: \"06232b3f-0241-45ed-9267-c41302d3ce83\") " 
pod="openshift-marketplace/redhat-operators-cxd9s" Oct 01 11:12:23 crc kubenswrapper[4735]: I1001 11:12:23.794822 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkl2c\" (UniqueName: \"kubernetes.io/projected/06232b3f-0241-45ed-9267-c41302d3ce83-kube-api-access-gkl2c\") pod \"redhat-operators-cxd9s\" (UID: \"06232b3f-0241-45ed-9267-c41302d3ce83\") " pod="openshift-marketplace/redhat-operators-cxd9s" Oct 01 11:12:23 crc kubenswrapper[4735]: I1001 11:12:23.929461 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cxd9s" Oct 01 11:12:24 crc kubenswrapper[4735]: I1001 11:12:24.446457 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cxd9s"] Oct 01 11:12:25 crc kubenswrapper[4735]: I1001 11:12:25.409469 4735 generic.go:334] "Generic (PLEG): container finished" podID="06232b3f-0241-45ed-9267-c41302d3ce83" containerID="d0dd8ffb1b63c3e3da7b55e682b2381a1a01ab997e48d19c18b3d1f10bd791fa" exitCode=0 Oct 01 11:12:25 crc kubenswrapper[4735]: I1001 11:12:25.409543 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxd9s" event={"ID":"06232b3f-0241-45ed-9267-c41302d3ce83","Type":"ContainerDied","Data":"d0dd8ffb1b63c3e3da7b55e682b2381a1a01ab997e48d19c18b3d1f10bd791fa"} Oct 01 11:12:25 crc kubenswrapper[4735]: I1001 11:12:25.409880 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxd9s" event={"ID":"06232b3f-0241-45ed-9267-c41302d3ce83","Type":"ContainerStarted","Data":"991fb549184d00d971e026cbe86f7f421bca47c66eb8dd36151aaaaeca0e10a4"} Oct 01 11:12:25 crc kubenswrapper[4735]: I1001 11:12:25.415151 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 11:12:26 crc kubenswrapper[4735]: I1001 11:12:26.422204 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-cxd9s" event={"ID":"06232b3f-0241-45ed-9267-c41302d3ce83","Type":"ContainerStarted","Data":"2975ca5479617c1e5dde995bf95fcf8ac625bfd03b2c72dc10e34480b7c0e65f"} Oct 01 11:12:27 crc kubenswrapper[4735]: I1001 11:12:27.438384 4735 generic.go:334] "Generic (PLEG): container finished" podID="06232b3f-0241-45ed-9267-c41302d3ce83" containerID="2975ca5479617c1e5dde995bf95fcf8ac625bfd03b2c72dc10e34480b7c0e65f" exitCode=0 Oct 01 11:12:27 crc kubenswrapper[4735]: I1001 11:12:27.439374 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxd9s" event={"ID":"06232b3f-0241-45ed-9267-c41302d3ce83","Type":"ContainerDied","Data":"2975ca5479617c1e5dde995bf95fcf8ac625bfd03b2c72dc10e34480b7c0e65f"} Oct 01 11:12:30 crc kubenswrapper[4735]: I1001 11:12:30.473164 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxd9s" event={"ID":"06232b3f-0241-45ed-9267-c41302d3ce83","Type":"ContainerStarted","Data":"a38542e0f6878350539f0fe5d8b9c3db1423d68db90e838aacd5997686d9287e"} Oct 01 11:12:30 crc kubenswrapper[4735]: I1001 11:12:30.514179 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cxd9s" podStartSLOduration=3.595310257 podStartE2EDuration="7.514151401s" podCreationTimestamp="2025-10-01 11:12:23 +0000 UTC" firstStartedPulling="2025-10-01 11:12:25.41485373 +0000 UTC m=+3304.107675002" lastFinishedPulling="2025-10-01 11:12:29.333694884 +0000 UTC m=+3308.026516146" observedRunningTime="2025-10-01 11:12:30.508724295 +0000 UTC m=+3309.201545597" watchObservedRunningTime="2025-10-01 11:12:30.514151401 +0000 UTC m=+3309.206972673" Oct 01 11:12:33 crc kubenswrapper[4735]: I1001 11:12:33.930649 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cxd9s" Oct 01 11:12:33 crc kubenswrapper[4735]: I1001 11:12:33.930935 4735 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cxd9s" Oct 01 11:12:34 crc kubenswrapper[4735]: I1001 11:12:34.986767 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cxd9s" podUID="06232b3f-0241-45ed-9267-c41302d3ce83" containerName="registry-server" probeResult="failure" output=< Oct 01 11:12:34 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Oct 01 11:12:34 crc kubenswrapper[4735]: > Oct 01 11:12:44 crc kubenswrapper[4735]: I1001 11:12:44.003938 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cxd9s" Oct 01 11:12:44 crc kubenswrapper[4735]: I1001 11:12:44.059194 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cxd9s" Oct 01 11:12:44 crc kubenswrapper[4735]: I1001 11:12:44.242117 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cxd9s"] Oct 01 11:12:45 crc kubenswrapper[4735]: I1001 11:12:45.650442 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cxd9s" podUID="06232b3f-0241-45ed-9267-c41302d3ce83" containerName="registry-server" containerID="cri-o://a38542e0f6878350539f0fe5d8b9c3db1423d68db90e838aacd5997686d9287e" gracePeriod=2 Oct 01 11:12:46 crc kubenswrapper[4735]: I1001 11:12:46.219483 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cxd9s" Oct 01 11:12:46 crc kubenswrapper[4735]: I1001 11:12:46.391653 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06232b3f-0241-45ed-9267-c41302d3ce83-catalog-content\") pod \"06232b3f-0241-45ed-9267-c41302d3ce83\" (UID: \"06232b3f-0241-45ed-9267-c41302d3ce83\") " Oct 01 11:12:46 crc kubenswrapper[4735]: I1001 11:12:46.392065 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06232b3f-0241-45ed-9267-c41302d3ce83-utilities\") pod \"06232b3f-0241-45ed-9267-c41302d3ce83\" (UID: \"06232b3f-0241-45ed-9267-c41302d3ce83\") " Oct 01 11:12:46 crc kubenswrapper[4735]: I1001 11:12:46.392380 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkl2c\" (UniqueName: \"kubernetes.io/projected/06232b3f-0241-45ed-9267-c41302d3ce83-kube-api-access-gkl2c\") pod \"06232b3f-0241-45ed-9267-c41302d3ce83\" (UID: \"06232b3f-0241-45ed-9267-c41302d3ce83\") " Oct 01 11:12:46 crc kubenswrapper[4735]: I1001 11:12:46.409770 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06232b3f-0241-45ed-9267-c41302d3ce83-utilities" (OuterVolumeSpecName: "utilities") pod "06232b3f-0241-45ed-9267-c41302d3ce83" (UID: "06232b3f-0241-45ed-9267-c41302d3ce83"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:12:46 crc kubenswrapper[4735]: I1001 11:12:46.418857 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06232b3f-0241-45ed-9267-c41302d3ce83-kube-api-access-gkl2c" (OuterVolumeSpecName: "kube-api-access-gkl2c") pod "06232b3f-0241-45ed-9267-c41302d3ce83" (UID: "06232b3f-0241-45ed-9267-c41302d3ce83"). InnerVolumeSpecName "kube-api-access-gkl2c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:12:46 crc kubenswrapper[4735]: I1001 11:12:46.494245 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06232b3f-0241-45ed-9267-c41302d3ce83-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "06232b3f-0241-45ed-9267-c41302d3ce83" (UID: "06232b3f-0241-45ed-9267-c41302d3ce83"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:12:46 crc kubenswrapper[4735]: I1001 11:12:46.495294 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkl2c\" (UniqueName: \"kubernetes.io/projected/06232b3f-0241-45ed-9267-c41302d3ce83-kube-api-access-gkl2c\") on node \"crc\" DevicePath \"\"" Oct 01 11:12:46 crc kubenswrapper[4735]: I1001 11:12:46.495334 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06232b3f-0241-45ed-9267-c41302d3ce83-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 11:12:46 crc kubenswrapper[4735]: I1001 11:12:46.495346 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06232b3f-0241-45ed-9267-c41302d3ce83-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 11:12:46 crc kubenswrapper[4735]: I1001 11:12:46.661404 4735 generic.go:334] "Generic (PLEG): container finished" podID="06232b3f-0241-45ed-9267-c41302d3ce83" containerID="a38542e0f6878350539f0fe5d8b9c3db1423d68db90e838aacd5997686d9287e" exitCode=0 Oct 01 11:12:46 crc kubenswrapper[4735]: I1001 11:12:46.661445 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cxd9s" event={"ID":"06232b3f-0241-45ed-9267-c41302d3ce83","Type":"ContainerDied","Data":"a38542e0f6878350539f0fe5d8b9c3db1423d68db90e838aacd5997686d9287e"} Oct 01 11:12:46 crc kubenswrapper[4735]: I1001 11:12:46.661472 4735 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-cxd9s" event={"ID":"06232b3f-0241-45ed-9267-c41302d3ce83","Type":"ContainerDied","Data":"991fb549184d00d971e026cbe86f7f421bca47c66eb8dd36151aaaaeca0e10a4"} Oct 01 11:12:46 crc kubenswrapper[4735]: I1001 11:12:46.661510 4735 scope.go:117] "RemoveContainer" containerID="a38542e0f6878350539f0fe5d8b9c3db1423d68db90e838aacd5997686d9287e" Oct 01 11:12:46 crc kubenswrapper[4735]: I1001 11:12:46.661516 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cxd9s" Oct 01 11:12:46 crc kubenswrapper[4735]: I1001 11:12:46.680570 4735 scope.go:117] "RemoveContainer" containerID="2975ca5479617c1e5dde995bf95fcf8ac625bfd03b2c72dc10e34480b7c0e65f" Oct 01 11:12:46 crc kubenswrapper[4735]: I1001 11:12:46.704859 4735 scope.go:117] "RemoveContainer" containerID="d0dd8ffb1b63c3e3da7b55e682b2381a1a01ab997e48d19c18b3d1f10bd791fa" Oct 01 11:12:46 crc kubenswrapper[4735]: I1001 11:12:46.759034 4735 scope.go:117] "RemoveContainer" containerID="a38542e0f6878350539f0fe5d8b9c3db1423d68db90e838aacd5997686d9287e" Oct 01 11:12:46 crc kubenswrapper[4735]: I1001 11:12:46.759517 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cxd9s"] Oct 01 11:12:46 crc kubenswrapper[4735]: E1001 11:12:46.759822 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a38542e0f6878350539f0fe5d8b9c3db1423d68db90e838aacd5997686d9287e\": container with ID starting with a38542e0f6878350539f0fe5d8b9c3db1423d68db90e838aacd5997686d9287e not found: ID does not exist" containerID="a38542e0f6878350539f0fe5d8b9c3db1423d68db90e838aacd5997686d9287e" Oct 01 11:12:46 crc kubenswrapper[4735]: I1001 11:12:46.759868 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a38542e0f6878350539f0fe5d8b9c3db1423d68db90e838aacd5997686d9287e"} err="failed to 
get container status \"a38542e0f6878350539f0fe5d8b9c3db1423d68db90e838aacd5997686d9287e\": rpc error: code = NotFound desc = could not find container \"a38542e0f6878350539f0fe5d8b9c3db1423d68db90e838aacd5997686d9287e\": container with ID starting with a38542e0f6878350539f0fe5d8b9c3db1423d68db90e838aacd5997686d9287e not found: ID does not exist" Oct 01 11:12:46 crc kubenswrapper[4735]: I1001 11:12:46.759903 4735 scope.go:117] "RemoveContainer" containerID="2975ca5479617c1e5dde995bf95fcf8ac625bfd03b2c72dc10e34480b7c0e65f" Oct 01 11:12:46 crc kubenswrapper[4735]: E1001 11:12:46.760184 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2975ca5479617c1e5dde995bf95fcf8ac625bfd03b2c72dc10e34480b7c0e65f\": container with ID starting with 2975ca5479617c1e5dde995bf95fcf8ac625bfd03b2c72dc10e34480b7c0e65f not found: ID does not exist" containerID="2975ca5479617c1e5dde995bf95fcf8ac625bfd03b2c72dc10e34480b7c0e65f" Oct 01 11:12:46 crc kubenswrapper[4735]: I1001 11:12:46.760222 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2975ca5479617c1e5dde995bf95fcf8ac625bfd03b2c72dc10e34480b7c0e65f"} err="failed to get container status \"2975ca5479617c1e5dde995bf95fcf8ac625bfd03b2c72dc10e34480b7c0e65f\": rpc error: code = NotFound desc = could not find container \"2975ca5479617c1e5dde995bf95fcf8ac625bfd03b2c72dc10e34480b7c0e65f\": container with ID starting with 2975ca5479617c1e5dde995bf95fcf8ac625bfd03b2c72dc10e34480b7c0e65f not found: ID does not exist" Oct 01 11:12:46 crc kubenswrapper[4735]: I1001 11:12:46.760246 4735 scope.go:117] "RemoveContainer" containerID="d0dd8ffb1b63c3e3da7b55e682b2381a1a01ab997e48d19c18b3d1f10bd791fa" Oct 01 11:12:46 crc kubenswrapper[4735]: E1001 11:12:46.760693 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d0dd8ffb1b63c3e3da7b55e682b2381a1a01ab997e48d19c18b3d1f10bd791fa\": container with ID starting with d0dd8ffb1b63c3e3da7b55e682b2381a1a01ab997e48d19c18b3d1f10bd791fa not found: ID does not exist" containerID="d0dd8ffb1b63c3e3da7b55e682b2381a1a01ab997e48d19c18b3d1f10bd791fa" Oct 01 11:12:46 crc kubenswrapper[4735]: I1001 11:12:46.760752 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0dd8ffb1b63c3e3da7b55e682b2381a1a01ab997e48d19c18b3d1f10bd791fa"} err="failed to get container status \"d0dd8ffb1b63c3e3da7b55e682b2381a1a01ab997e48d19c18b3d1f10bd791fa\": rpc error: code = NotFound desc = could not find container \"d0dd8ffb1b63c3e3da7b55e682b2381a1a01ab997e48d19c18b3d1f10bd791fa\": container with ID starting with d0dd8ffb1b63c3e3da7b55e682b2381a1a01ab997e48d19c18b3d1f10bd791fa not found: ID does not exist" Oct 01 11:12:46 crc kubenswrapper[4735]: I1001 11:12:46.778896 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cxd9s"] Oct 01 11:12:47 crc kubenswrapper[4735]: I1001 11:12:47.911080 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06232b3f-0241-45ed-9267-c41302d3ce83" path="/var/lib/kubelet/pods/06232b3f-0241-45ed-9267-c41302d3ce83/volumes" Oct 01 11:13:23 crc kubenswrapper[4735]: I1001 11:13:23.063521 4735 generic.go:334] "Generic (PLEG): container finished" podID="66ee97cc-ac56-4879-b605-e2a9347213ca" containerID="c66c2a90fcf19c0c94e9ab286bd40fbe3f728bc2c931291136def5027cdcae1b" exitCode=0 Oct 01 11:13:23 crc kubenswrapper[4735]: I1001 11:13:23.063737 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"66ee97cc-ac56-4879-b605-e2a9347213ca","Type":"ContainerDied","Data":"c66c2a90fcf19c0c94e9ab286bd40fbe3f728bc2c931291136def5027cdcae1b"} Oct 01 11:13:24 crc kubenswrapper[4735]: I1001 11:13:24.357749 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 01 11:13:24 crc kubenswrapper[4735]: I1001 11:13:24.409938 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66ee97cc-ac56-4879-b605-e2a9347213ca-config-data\") pod \"66ee97cc-ac56-4879-b605-e2a9347213ca\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " Oct 01 11:13:24 crc kubenswrapper[4735]: I1001 11:13:24.410029 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/66ee97cc-ac56-4879-b605-e2a9347213ca-test-operator-ephemeral-temporary\") pod \"66ee97cc-ac56-4879-b605-e2a9347213ca\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " Oct 01 11:13:24 crc kubenswrapper[4735]: I1001 11:13:24.410062 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66ee97cc-ac56-4879-b605-e2a9347213ca-openstack-config-secret\") pod \"66ee97cc-ac56-4879-b605-e2a9347213ca\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " Oct 01 11:13:24 crc kubenswrapper[4735]: I1001 11:13:24.410171 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kmxl\" (UniqueName: \"kubernetes.io/projected/66ee97cc-ac56-4879-b605-e2a9347213ca-kube-api-access-5kmxl\") pod \"66ee97cc-ac56-4879-b605-e2a9347213ca\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " Oct 01 11:13:24 crc kubenswrapper[4735]: I1001 11:13:24.410252 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/66ee97cc-ac56-4879-b605-e2a9347213ca-test-operator-ephemeral-workdir\") pod \"66ee97cc-ac56-4879-b605-e2a9347213ca\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " Oct 01 11:13:24 crc kubenswrapper[4735]: I1001 11:13:24.410765 4735 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66ee97cc-ac56-4879-b605-e2a9347213ca-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "66ee97cc-ac56-4879-b605-e2a9347213ca" (UID: "66ee97cc-ac56-4879-b605-e2a9347213ca"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:13:24 crc kubenswrapper[4735]: I1001 11:13:24.411333 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66ee97cc-ac56-4879-b605-e2a9347213ca-config-data" (OuterVolumeSpecName: "config-data") pod "66ee97cc-ac56-4879-b605-e2a9347213ca" (UID: "66ee97cc-ac56-4879-b605-e2a9347213ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:13:24 crc kubenswrapper[4735]: I1001 11:13:24.415822 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/66ee97cc-ac56-4879-b605-e2a9347213ca-ca-certs\") pod \"66ee97cc-ac56-4879-b605-e2a9347213ca\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " Oct 01 11:13:24 crc kubenswrapper[4735]: I1001 11:13:24.415937 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/66ee97cc-ac56-4879-b605-e2a9347213ca-ssh-key\") pod \"66ee97cc-ac56-4879-b605-e2a9347213ca\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " Oct 01 11:13:24 crc kubenswrapper[4735]: I1001 11:13:24.415994 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"66ee97cc-ac56-4879-b605-e2a9347213ca\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " Oct 01 11:13:24 crc kubenswrapper[4735]: I1001 11:13:24.416078 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66ee97cc-ac56-4879-b605-e2a9347213ca-openstack-config\") pod \"66ee97cc-ac56-4879-b605-e2a9347213ca\" (UID: \"66ee97cc-ac56-4879-b605-e2a9347213ca\") " Oct 01 11:13:24 crc kubenswrapper[4735]: I1001 11:13:24.416949 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66ee97cc-ac56-4879-b605-e2a9347213ca-kube-api-access-5kmxl" (OuterVolumeSpecName: "kube-api-access-5kmxl") pod "66ee97cc-ac56-4879-b605-e2a9347213ca" (UID: "66ee97cc-ac56-4879-b605-e2a9347213ca"). InnerVolumeSpecName "kube-api-access-5kmxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:13:24 crc kubenswrapper[4735]: I1001 11:13:24.420383 4735 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/66ee97cc-ac56-4879-b605-e2a9347213ca-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 01 11:13:24 crc kubenswrapper[4735]: I1001 11:13:24.420560 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kmxl\" (UniqueName: \"kubernetes.io/projected/66ee97cc-ac56-4879-b605-e2a9347213ca-kube-api-access-5kmxl\") on node \"crc\" DevicePath \"\"" Oct 01 11:13:24 crc kubenswrapper[4735]: I1001 11:13:24.420583 4735 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66ee97cc-ac56-4879-b605-e2a9347213ca-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 11:13:24 crc kubenswrapper[4735]: I1001 11:13:24.423945 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66ee97cc-ac56-4879-b605-e2a9347213ca-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "66ee97cc-ac56-4879-b605-e2a9347213ca" (UID: "66ee97cc-ac56-4879-b605-e2a9347213ca"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:13:24 crc kubenswrapper[4735]: I1001 11:13:24.424511 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "test-operator-logs") pod "66ee97cc-ac56-4879-b605-e2a9347213ca" (UID: "66ee97cc-ac56-4879-b605-e2a9347213ca"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 11:13:24 crc kubenswrapper[4735]: I1001 11:13:24.451537 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66ee97cc-ac56-4879-b605-e2a9347213ca-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "66ee97cc-ac56-4879-b605-e2a9347213ca" (UID: "66ee97cc-ac56-4879-b605-e2a9347213ca"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:13:24 crc kubenswrapper[4735]: I1001 11:13:24.451562 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66ee97cc-ac56-4879-b605-e2a9347213ca-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "66ee97cc-ac56-4879-b605-e2a9347213ca" (UID: "66ee97cc-ac56-4879-b605-e2a9347213ca"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:13:24 crc kubenswrapper[4735]: I1001 11:13:24.455749 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66ee97cc-ac56-4879-b605-e2a9347213ca-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "66ee97cc-ac56-4879-b605-e2a9347213ca" (UID: "66ee97cc-ac56-4879-b605-e2a9347213ca"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:13:24 crc kubenswrapper[4735]: I1001 11:13:24.482409 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66ee97cc-ac56-4879-b605-e2a9347213ca-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "66ee97cc-ac56-4879-b605-e2a9347213ca" (UID: "66ee97cc-ac56-4879-b605-e2a9347213ca"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:13:24 crc kubenswrapper[4735]: I1001 11:13:24.522668 4735 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66ee97cc-ac56-4879-b605-e2a9347213ca-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 01 11:13:24 crc kubenswrapper[4735]: I1001 11:13:24.522721 4735 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/66ee97cc-ac56-4879-b605-e2a9347213ca-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 01 11:13:24 crc kubenswrapper[4735]: I1001 11:13:24.522743 4735 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/66ee97cc-ac56-4879-b605-e2a9347213ca-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 01 11:13:24 crc kubenswrapper[4735]: I1001 11:13:24.522761 4735 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/66ee97cc-ac56-4879-b605-e2a9347213ca-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 11:13:24 crc kubenswrapper[4735]: I1001 11:13:24.522812 4735 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 01 11:13:24 crc kubenswrapper[4735]: I1001 11:13:24.522829 4735 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/66ee97cc-ac56-4879-b605-e2a9347213ca-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 01 11:13:24 crc kubenswrapper[4735]: I1001 11:13:24.550456 4735 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 01 11:13:24 crc kubenswrapper[4735]: I1001 11:13:24.624821 4735 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 01 11:13:25 crc kubenswrapper[4735]: I1001 11:13:25.082347 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"66ee97cc-ac56-4879-b605-e2a9347213ca","Type":"ContainerDied","Data":"52d3827b8ebb42159d843dc0cb7d4ac02ab4e0373f52d8b083c88ca63459449e"} Oct 01 11:13:25 crc kubenswrapper[4735]: I1001 11:13:25.082691 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52d3827b8ebb42159d843dc0cb7d4ac02ab4e0373f52d8b083c88ca63459449e" Oct 01 11:13:25 crc kubenswrapper[4735]: I1001 11:13:25.082440 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 01 11:13:29 crc kubenswrapper[4735]: I1001 11:13:29.943299 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 01 11:13:29 crc kubenswrapper[4735]: E1001 11:13:29.944261 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06232b3f-0241-45ed-9267-c41302d3ce83" containerName="extract-utilities" Oct 01 11:13:29 crc kubenswrapper[4735]: I1001 11:13:29.944295 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="06232b3f-0241-45ed-9267-c41302d3ce83" containerName="extract-utilities" Oct 01 11:13:29 crc kubenswrapper[4735]: E1001 11:13:29.944312 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06232b3f-0241-45ed-9267-c41302d3ce83" containerName="registry-server" Oct 01 11:13:29 crc kubenswrapper[4735]: I1001 11:13:29.944318 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="06232b3f-0241-45ed-9267-c41302d3ce83" containerName="registry-server" Oct 01 11:13:29 crc kubenswrapper[4735]: E1001 11:13:29.944333 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66ee97cc-ac56-4879-b605-e2a9347213ca" containerName="tempest-tests-tempest-tests-runner" Oct 01 11:13:29 crc kubenswrapper[4735]: I1001 11:13:29.944340 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="66ee97cc-ac56-4879-b605-e2a9347213ca" containerName="tempest-tests-tempest-tests-runner" Oct 01 11:13:29 crc kubenswrapper[4735]: E1001 11:13:29.944377 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06232b3f-0241-45ed-9267-c41302d3ce83" containerName="extract-content" Oct 01 11:13:29 crc kubenswrapper[4735]: I1001 11:13:29.944383 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="06232b3f-0241-45ed-9267-c41302d3ce83" containerName="extract-content" Oct 01 11:13:29 crc kubenswrapper[4735]: I1001 11:13:29.944612 4735 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="66ee97cc-ac56-4879-b605-e2a9347213ca" containerName="tempest-tests-tempest-tests-runner" Oct 01 11:13:29 crc kubenswrapper[4735]: I1001 11:13:29.944627 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="06232b3f-0241-45ed-9267-c41302d3ce83" containerName="registry-server" Oct 01 11:13:29 crc kubenswrapper[4735]: I1001 11:13:29.945250 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 11:13:29 crc kubenswrapper[4735]: I1001 11:13:29.950830 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-dlsqd" Oct 01 11:13:29 crc kubenswrapper[4735]: I1001 11:13:29.959157 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 01 11:13:30 crc kubenswrapper[4735]: I1001 11:13:30.076081 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6001430a-4111-47f2-ba18-ee2e2661bb7c\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 11:13:30 crc kubenswrapper[4735]: I1001 11:13:30.076404 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsf9v\" (UniqueName: \"kubernetes.io/projected/6001430a-4111-47f2-ba18-ee2e2661bb7c-kube-api-access-jsf9v\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6001430a-4111-47f2-ba18-ee2e2661bb7c\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 11:13:30 crc kubenswrapper[4735]: I1001 11:13:30.178331 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsf9v\" (UniqueName: 
\"kubernetes.io/projected/6001430a-4111-47f2-ba18-ee2e2661bb7c-kube-api-access-jsf9v\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6001430a-4111-47f2-ba18-ee2e2661bb7c\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 11:13:30 crc kubenswrapper[4735]: I1001 11:13:30.178638 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6001430a-4111-47f2-ba18-ee2e2661bb7c\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 11:13:30 crc kubenswrapper[4735]: I1001 11:13:30.179186 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6001430a-4111-47f2-ba18-ee2e2661bb7c\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 11:13:30 crc kubenswrapper[4735]: I1001 11:13:30.204036 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsf9v\" (UniqueName: \"kubernetes.io/projected/6001430a-4111-47f2-ba18-ee2e2661bb7c-kube-api-access-jsf9v\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6001430a-4111-47f2-ba18-ee2e2661bb7c\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 11:13:30 crc kubenswrapper[4735]: I1001 11:13:30.236163 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6001430a-4111-47f2-ba18-ee2e2661bb7c\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 11:13:30 
crc kubenswrapper[4735]: I1001 11:13:30.276189 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 11:13:30 crc kubenswrapper[4735]: I1001 11:13:30.801408 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 01 11:13:31 crc kubenswrapper[4735]: I1001 11:13:31.166267 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"6001430a-4111-47f2-ba18-ee2e2661bb7c","Type":"ContainerStarted","Data":"4e260fe45a552b4f29302d4fdbc45cb517111174fac6018fe0a1d014d9a622fb"} Oct 01 11:13:32 crc kubenswrapper[4735]: I1001 11:13:32.175851 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"6001430a-4111-47f2-ba18-ee2e2661bb7c","Type":"ContainerStarted","Data":"06e0c2ccd99be3de329053f9372a61712f268be16f70b4711cb64708d0b216a6"} Oct 01 11:13:32 crc kubenswrapper[4735]: I1001 11:13:32.191995 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.229895438 podStartE2EDuration="3.191974008s" podCreationTimestamp="2025-10-01 11:13:29 +0000 UTC" firstStartedPulling="2025-10-01 11:13:30.81084551 +0000 UTC m=+3369.503666772" lastFinishedPulling="2025-10-01 11:13:31.77292406 +0000 UTC m=+3370.465745342" observedRunningTime="2025-10-01 11:13:32.186713448 +0000 UTC m=+3370.879534730" watchObservedRunningTime="2025-10-01 11:13:32.191974008 +0000 UTC m=+3370.884795270" Oct 01 11:13:49 crc kubenswrapper[4735]: I1001 11:13:49.263554 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pkp6p/must-gather-6s4pg"] Oct 01 11:13:49 crc kubenswrapper[4735]: I1001 11:13:49.265369 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pkp6p/must-gather-6s4pg" Oct 01 11:13:49 crc kubenswrapper[4735]: I1001 11:13:49.267326 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-pkp6p"/"openshift-service-ca.crt" Oct 01 11:13:49 crc kubenswrapper[4735]: I1001 11:13:49.270394 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-pkp6p"/"kube-root-ca.crt" Oct 01 11:13:49 crc kubenswrapper[4735]: I1001 11:13:49.329702 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pkp6p/must-gather-6s4pg"] Oct 01 11:13:49 crc kubenswrapper[4735]: I1001 11:13:49.422189 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a6a5fa64-2376-48c5-9199-b25978c0cd0e-must-gather-output\") pod \"must-gather-6s4pg\" (UID: \"a6a5fa64-2376-48c5-9199-b25978c0cd0e\") " pod="openshift-must-gather-pkp6p/must-gather-6s4pg" Oct 01 11:13:49 crc kubenswrapper[4735]: I1001 11:13:49.422243 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6chw\" (UniqueName: \"kubernetes.io/projected/a6a5fa64-2376-48c5-9199-b25978c0cd0e-kube-api-access-q6chw\") pod \"must-gather-6s4pg\" (UID: \"a6a5fa64-2376-48c5-9199-b25978c0cd0e\") " pod="openshift-must-gather-pkp6p/must-gather-6s4pg" Oct 01 11:13:49 crc kubenswrapper[4735]: I1001 11:13:49.524261 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a6a5fa64-2376-48c5-9199-b25978c0cd0e-must-gather-output\") pod \"must-gather-6s4pg\" (UID: \"a6a5fa64-2376-48c5-9199-b25978c0cd0e\") " pod="openshift-must-gather-pkp6p/must-gather-6s4pg" Oct 01 11:13:49 crc kubenswrapper[4735]: I1001 11:13:49.524311 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-q6chw\" (UniqueName: \"kubernetes.io/projected/a6a5fa64-2376-48c5-9199-b25978c0cd0e-kube-api-access-q6chw\") pod \"must-gather-6s4pg\" (UID: \"a6a5fa64-2376-48c5-9199-b25978c0cd0e\") " pod="openshift-must-gather-pkp6p/must-gather-6s4pg" Oct 01 11:13:49 crc kubenswrapper[4735]: I1001 11:13:49.524729 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a6a5fa64-2376-48c5-9199-b25978c0cd0e-must-gather-output\") pod \"must-gather-6s4pg\" (UID: \"a6a5fa64-2376-48c5-9199-b25978c0cd0e\") " pod="openshift-must-gather-pkp6p/must-gather-6s4pg" Oct 01 11:13:49 crc kubenswrapper[4735]: I1001 11:13:49.543275 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6chw\" (UniqueName: \"kubernetes.io/projected/a6a5fa64-2376-48c5-9199-b25978c0cd0e-kube-api-access-q6chw\") pod \"must-gather-6s4pg\" (UID: \"a6a5fa64-2376-48c5-9199-b25978c0cd0e\") " pod="openshift-must-gather-pkp6p/must-gather-6s4pg" Oct 01 11:13:49 crc kubenswrapper[4735]: I1001 11:13:49.582972 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pkp6p/must-gather-6s4pg" Oct 01 11:13:50 crc kubenswrapper[4735]: I1001 11:13:50.013338 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pkp6p/must-gather-6s4pg"] Oct 01 11:13:50 crc kubenswrapper[4735]: W1001 11:13:50.020224 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6a5fa64_2376_48c5_9199_b25978c0cd0e.slice/crio-1bcdccdc81a8c6af86420816212cc4370ca22415513c1c252fe87beb3899f308 WatchSource:0}: Error finding container 1bcdccdc81a8c6af86420816212cc4370ca22415513c1c252fe87beb3899f308: Status 404 returned error can't find the container with id 1bcdccdc81a8c6af86420816212cc4370ca22415513c1c252fe87beb3899f308 Oct 01 11:13:50 crc kubenswrapper[4735]: I1001 11:13:50.373534 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pkp6p/must-gather-6s4pg" event={"ID":"a6a5fa64-2376-48c5-9199-b25978c0cd0e","Type":"ContainerStarted","Data":"1bcdccdc81a8c6af86420816212cc4370ca22415513c1c252fe87beb3899f308"} Oct 01 11:13:54 crc kubenswrapper[4735]: I1001 11:13:54.414032 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pkp6p/must-gather-6s4pg" event={"ID":"a6a5fa64-2376-48c5-9199-b25978c0cd0e","Type":"ContainerStarted","Data":"b53ff69a76e675c806283e11cd9b4e413721af7845039daafdf1705b6a087c38"} Oct 01 11:13:54 crc kubenswrapper[4735]: I1001 11:13:54.414657 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pkp6p/must-gather-6s4pg" event={"ID":"a6a5fa64-2376-48c5-9199-b25978c0cd0e","Type":"ContainerStarted","Data":"bddc77c4dbfd99e3f9d8a234ec6ca5919ca783f18cef07eedddc8822aadaf84a"} Oct 01 11:13:57 crc kubenswrapper[4735]: I1001 11:13:57.610458 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pkp6p/must-gather-6s4pg" podStartSLOduration=4.849297475 
podStartE2EDuration="8.610436579s" podCreationTimestamp="2025-10-01 11:13:49 +0000 UTC" firstStartedPulling="2025-10-01 11:13:50.023086975 +0000 UTC m=+3388.715908247" lastFinishedPulling="2025-10-01 11:13:53.784226089 +0000 UTC m=+3392.477047351" observedRunningTime="2025-10-01 11:13:54.431563669 +0000 UTC m=+3393.124384981" watchObservedRunningTime="2025-10-01 11:13:57.610436579 +0000 UTC m=+3396.303257841" Oct 01 11:13:57 crc kubenswrapper[4735]: I1001 11:13:57.617371 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pkp6p/crc-debug-28cnw"] Oct 01 11:13:57 crc kubenswrapper[4735]: I1001 11:13:57.618852 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pkp6p/crc-debug-28cnw" Oct 01 11:13:57 crc kubenswrapper[4735]: I1001 11:13:57.622680 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-pkp6p"/"default-dockercfg-rzrgf" Oct 01 11:13:57 crc kubenswrapper[4735]: I1001 11:13:57.682743 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m7sj\" (UniqueName: \"kubernetes.io/projected/00ab2086-d717-427a-ba6e-8d4ad379150d-kube-api-access-7m7sj\") pod \"crc-debug-28cnw\" (UID: \"00ab2086-d717-427a-ba6e-8d4ad379150d\") " pod="openshift-must-gather-pkp6p/crc-debug-28cnw" Oct 01 11:13:57 crc kubenswrapper[4735]: I1001 11:13:57.682901 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/00ab2086-d717-427a-ba6e-8d4ad379150d-host\") pod \"crc-debug-28cnw\" (UID: \"00ab2086-d717-427a-ba6e-8d4ad379150d\") " pod="openshift-must-gather-pkp6p/crc-debug-28cnw" Oct 01 11:13:57 crc kubenswrapper[4735]: I1001 11:13:57.784887 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/00ab2086-d717-427a-ba6e-8d4ad379150d-host\") pod 
\"crc-debug-28cnw\" (UID: \"00ab2086-d717-427a-ba6e-8d4ad379150d\") " pod="openshift-must-gather-pkp6p/crc-debug-28cnw" Oct 01 11:13:57 crc kubenswrapper[4735]: I1001 11:13:57.785042 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/00ab2086-d717-427a-ba6e-8d4ad379150d-host\") pod \"crc-debug-28cnw\" (UID: \"00ab2086-d717-427a-ba6e-8d4ad379150d\") " pod="openshift-must-gather-pkp6p/crc-debug-28cnw" Oct 01 11:13:57 crc kubenswrapper[4735]: I1001 11:13:57.785076 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m7sj\" (UniqueName: \"kubernetes.io/projected/00ab2086-d717-427a-ba6e-8d4ad379150d-kube-api-access-7m7sj\") pod \"crc-debug-28cnw\" (UID: \"00ab2086-d717-427a-ba6e-8d4ad379150d\") " pod="openshift-must-gather-pkp6p/crc-debug-28cnw" Oct 01 11:13:57 crc kubenswrapper[4735]: I1001 11:13:57.806976 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m7sj\" (UniqueName: \"kubernetes.io/projected/00ab2086-d717-427a-ba6e-8d4ad379150d-kube-api-access-7m7sj\") pod \"crc-debug-28cnw\" (UID: \"00ab2086-d717-427a-ba6e-8d4ad379150d\") " pod="openshift-must-gather-pkp6p/crc-debug-28cnw" Oct 01 11:13:57 crc kubenswrapper[4735]: I1001 11:13:57.936021 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pkp6p/crc-debug-28cnw" Oct 01 11:13:57 crc kubenswrapper[4735]: W1001 11:13:57.973661 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00ab2086_d717_427a_ba6e_8d4ad379150d.slice/crio-903ce35f01f887ec2a2a26a27deb1d942a621921bf4a60d3d1fbc9392da1319e WatchSource:0}: Error finding container 903ce35f01f887ec2a2a26a27deb1d942a621921bf4a60d3d1fbc9392da1319e: Status 404 returned error can't find the container with id 903ce35f01f887ec2a2a26a27deb1d942a621921bf4a60d3d1fbc9392da1319e Oct 01 11:13:58 crc kubenswrapper[4735]: I1001 11:13:58.471879 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pkp6p/crc-debug-28cnw" event={"ID":"00ab2086-d717-427a-ba6e-8d4ad379150d","Type":"ContainerStarted","Data":"903ce35f01f887ec2a2a26a27deb1d942a621921bf4a60d3d1fbc9392da1319e"} Oct 01 11:14:05 crc kubenswrapper[4735]: I1001 11:14:05.485670 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 11:14:05 crc kubenswrapper[4735]: I1001 11:14:05.486225 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 11:14:08 crc kubenswrapper[4735]: I1001 11:14:08.574346 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pkp6p/crc-debug-28cnw" 
event={"ID":"00ab2086-d717-427a-ba6e-8d4ad379150d","Type":"ContainerStarted","Data":"24b50f722b52bb6c43cffa275d301c452ab565b7d5b9376663723761988fa0b8"} Oct 01 11:14:08 crc kubenswrapper[4735]: I1001 11:14:08.593218 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pkp6p/crc-debug-28cnw" podStartSLOduration=1.32572433 podStartE2EDuration="11.593186059s" podCreationTimestamp="2025-10-01 11:13:57 +0000 UTC" firstStartedPulling="2025-10-01 11:13:57.975793546 +0000 UTC m=+3396.668614808" lastFinishedPulling="2025-10-01 11:14:08.243255275 +0000 UTC m=+3406.936076537" observedRunningTime="2025-10-01 11:14:08.591744231 +0000 UTC m=+3407.284565533" watchObservedRunningTime="2025-10-01 11:14:08.593186059 +0000 UTC m=+3407.286007361" Oct 01 11:14:35 crc kubenswrapper[4735]: I1001 11:14:35.485756 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 11:14:35 crc kubenswrapper[4735]: I1001 11:14:35.486618 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 11:14:58 crc kubenswrapper[4735]: I1001 11:14:58.339339 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7ddbc654c4-6bddt_e4554523-3ac0-4f1d-8a5d-f8892d72229d/barbican-api/0.log" Oct 01 11:14:58 crc kubenswrapper[4735]: I1001 11:14:58.458089 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7ddbc654c4-6bddt_e4554523-3ac0-4f1d-8a5d-f8892d72229d/barbican-api-log/0.log" Oct 01 11:14:58 
crc kubenswrapper[4735]: I1001 11:14:58.540818 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-599c96b4d8-46mbs_4ce28744-bfa2-4674-a495-02abf8245d38/barbican-keystone-listener/0.log" Oct 01 11:14:58 crc kubenswrapper[4735]: I1001 11:14:58.707882 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-599c96b4d8-46mbs_4ce28744-bfa2-4674-a495-02abf8245d38/barbican-keystone-listener-log/0.log" Oct 01 11:14:58 crc kubenswrapper[4735]: I1001 11:14:58.840163 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-55fd7c674f-kfd9n_39f45bdc-e25e-41f8-aefe-0dede18c4bb8/barbican-worker/0.log" Oct 01 11:14:58 crc kubenswrapper[4735]: I1001 11:14:58.950615 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-55fd7c674f-kfd9n_39f45bdc-e25e-41f8-aefe-0dede18c4bb8/barbican-worker-log/0.log" Oct 01 11:14:59 crc kubenswrapper[4735]: I1001 11:14:59.105082 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq_ffacc50e-734b-4e8a-ac0c-a33197ce2351/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 11:14:59 crc kubenswrapper[4735]: I1001 11:14:59.238563 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_181bb1ae-a923-4bbd-8d8f-e5d8c8878214/ceilometer-central-agent/0.log" Oct 01 11:14:59 crc kubenswrapper[4735]: I1001 11:14:59.358937 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_181bb1ae-a923-4bbd-8d8f-e5d8c8878214/proxy-httpd/0.log" Oct 01 11:14:59 crc kubenswrapper[4735]: I1001 11:14:59.380090 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_181bb1ae-a923-4bbd-8d8f-e5d8c8878214/ceilometer-notification-agent/0.log" Oct 01 11:14:59 crc kubenswrapper[4735]: I1001 11:14:59.447558 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_181bb1ae-a923-4bbd-8d8f-e5d8c8878214/sg-core/0.log" Oct 01 11:14:59 crc kubenswrapper[4735]: I1001 11:14:59.637915 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a00c66d0-b381-43e2-ae45-8635ff4f424e/cinder-api/0.log" Oct 01 11:14:59 crc kubenswrapper[4735]: I1001 11:14:59.671027 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a00c66d0-b381-43e2-ae45-8635ff4f424e/cinder-api-log/0.log" Oct 01 11:14:59 crc kubenswrapper[4735]: I1001 11:14:59.886558 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_c7925532-6ff1-4914-84b0-8206d0ad5225/cinder-scheduler/0.log" Oct 01 11:14:59 crc kubenswrapper[4735]: I1001 11:14:59.903260 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_c7925532-6ff1-4914-84b0-8206d0ad5225/probe/0.log" Oct 01 11:15:00 crc kubenswrapper[4735]: I1001 11:15:00.077668 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-4hx9b_697ab5a4-56ef-4755-9211-fcd52866c939/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 11:15:00 crc kubenswrapper[4735]: I1001 11:15:00.173526 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321955-vp8pv"] Oct 01 11:15:00 crc kubenswrapper[4735]: I1001 11:15:00.174715 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321955-vp8pv" Oct 01 11:15:00 crc kubenswrapper[4735]: I1001 11:15:00.176683 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 11:15:00 crc kubenswrapper[4735]: I1001 11:15:00.178928 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 11:15:00 crc kubenswrapper[4735]: I1001 11:15:00.185074 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321955-vp8pv"] Oct 01 11:15:00 crc kubenswrapper[4735]: I1001 11:15:00.286079 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b55d28b6-1a2f-443d-a598-203423486fc3-secret-volume\") pod \"collect-profiles-29321955-vp8pv\" (UID: \"b55d28b6-1a2f-443d-a598-203423486fc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321955-vp8pv" Oct 01 11:15:00 crc kubenswrapper[4735]: I1001 11:15:00.286448 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b55d28b6-1a2f-443d-a598-203423486fc3-config-volume\") pod \"collect-profiles-29321955-vp8pv\" (UID: \"b55d28b6-1a2f-443d-a598-203423486fc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321955-vp8pv" Oct 01 11:15:00 crc kubenswrapper[4735]: I1001 11:15:00.286593 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vgz6\" (UniqueName: \"kubernetes.io/projected/b55d28b6-1a2f-443d-a598-203423486fc3-kube-api-access-5vgz6\") pod \"collect-profiles-29321955-vp8pv\" (UID: \"b55d28b6-1a2f-443d-a598-203423486fc3\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29321955-vp8pv" Oct 01 11:15:00 crc kubenswrapper[4735]: I1001 11:15:00.329462 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-qlbrt_36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 11:15:00 crc kubenswrapper[4735]: I1001 11:15:00.356805 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-svdrq_5b381c11-584e-4a17-b4a4-cd150f2d3d82/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 11:15:00 crc kubenswrapper[4735]: I1001 11:15:00.388434 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b55d28b6-1a2f-443d-a598-203423486fc3-config-volume\") pod \"collect-profiles-29321955-vp8pv\" (UID: \"b55d28b6-1a2f-443d-a598-203423486fc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321955-vp8pv" Oct 01 11:15:00 crc kubenswrapper[4735]: I1001 11:15:00.388576 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vgz6\" (UniqueName: \"kubernetes.io/projected/b55d28b6-1a2f-443d-a598-203423486fc3-kube-api-access-5vgz6\") pod \"collect-profiles-29321955-vp8pv\" (UID: \"b55d28b6-1a2f-443d-a598-203423486fc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321955-vp8pv" Oct 01 11:15:00 crc kubenswrapper[4735]: I1001 11:15:00.388652 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b55d28b6-1a2f-443d-a598-203423486fc3-secret-volume\") pod \"collect-profiles-29321955-vp8pv\" (UID: \"b55d28b6-1a2f-443d-a598-203423486fc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321955-vp8pv" Oct 01 11:15:00 crc kubenswrapper[4735]: I1001 11:15:00.390320 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b55d28b6-1a2f-443d-a598-203423486fc3-config-volume\") pod \"collect-profiles-29321955-vp8pv\" (UID: \"b55d28b6-1a2f-443d-a598-203423486fc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321955-vp8pv" Oct 01 11:15:00 crc kubenswrapper[4735]: I1001 11:15:00.400550 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b55d28b6-1a2f-443d-a598-203423486fc3-secret-volume\") pod \"collect-profiles-29321955-vp8pv\" (UID: \"b55d28b6-1a2f-443d-a598-203423486fc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321955-vp8pv" Oct 01 11:15:00 crc kubenswrapper[4735]: I1001 11:15:00.418249 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vgz6\" (UniqueName: \"kubernetes.io/projected/b55d28b6-1a2f-443d-a598-203423486fc3-kube-api-access-5vgz6\") pod \"collect-profiles-29321955-vp8pv\" (UID: \"b55d28b6-1a2f-443d-a598-203423486fc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321955-vp8pv" Oct 01 11:15:00 crc kubenswrapper[4735]: I1001 11:15:00.507038 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321955-vp8pv" Oct 01 11:15:00 crc kubenswrapper[4735]: I1001 11:15:00.631022 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-4vfgs_9b994d24-224b-42cf-8516-044c561a5f4e/init/0.log" Oct 01 11:15:00 crc kubenswrapper[4735]: I1001 11:15:00.881611 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-4vfgs_9b994d24-224b-42cf-8516-044c561a5f4e/init/0.log" Oct 01 11:15:00 crc kubenswrapper[4735]: I1001 11:15:00.897428 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-4vfgs_9b994d24-224b-42cf-8516-044c561a5f4e/dnsmasq-dns/0.log" Oct 01 11:15:00 crc kubenswrapper[4735]: I1001 11:15:00.995686 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321955-vp8pv"] Oct 01 11:15:01 crc kubenswrapper[4735]: I1001 11:15:01.029849 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321955-vp8pv" event={"ID":"b55d28b6-1a2f-443d-a598-203423486fc3","Type":"ContainerStarted","Data":"8d0ba3d92f3eb0914d0c69a613bb8ef3c61dfe330c4102261207653c651abe22"} Oct 01 11:15:01 crc kubenswrapper[4735]: I1001 11:15:01.150417 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-pq2m5_31970fe6-5cef-41cc-8799-c4e9de559f23/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 11:15:01 crc kubenswrapper[4735]: I1001 11:15:01.156150 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_53b4ae38-a993-4a83-93bf-da796d4be856/glance-httpd/0.log" Oct 01 11:15:01 crc kubenswrapper[4735]: I1001 11:15:01.375344 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_53b4ae38-a993-4a83-93bf-da796d4be856/glance-log/0.log" Oct 01 11:15:01 crc kubenswrapper[4735]: I1001 11:15:01.452200 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_971696de-af05-4d40-89d1-64a2688b08e0/glance-httpd/0.log" Oct 01 11:15:01 crc kubenswrapper[4735]: I1001 11:15:01.463206 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_971696de-af05-4d40-89d1-64a2688b08e0/glance-log/0.log" Oct 01 11:15:01 crc kubenswrapper[4735]: I1001 11:15:01.875857 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7746dbdbf6-t6f7n_7353c4ca-59bc-4a50-8840-8365f90f6384/horizon/0.log" Oct 01 11:15:02 crc kubenswrapper[4735]: I1001 11:15:02.033184 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z_77d6c2ca-cb77-4582-b644-d077086a29b5/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 11:15:02 crc kubenswrapper[4735]: I1001 11:15:02.039988 4735 generic.go:334] "Generic (PLEG): container finished" podID="b55d28b6-1a2f-443d-a598-203423486fc3" containerID="0eb0c3266b2b28fd613ab22c1caf6948460845dbee5dedf5b2e171df9ec176b5" exitCode=0 Oct 01 11:15:02 crc kubenswrapper[4735]: I1001 11:15:02.040025 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321955-vp8pv" event={"ID":"b55d28b6-1a2f-443d-a598-203423486fc3","Type":"ContainerDied","Data":"0eb0c3266b2b28fd613ab22c1caf6948460845dbee5dedf5b2e171df9ec176b5"} Oct 01 11:15:02 crc kubenswrapper[4735]: I1001 11:15:02.107122 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-cd2rs_569a20cc-087a-4c93-b23b-af5c6b209b80/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 11:15:02 crc kubenswrapper[4735]: I1001 
11:15:02.187690 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7746dbdbf6-t6f7n_7353c4ca-59bc-4a50-8840-8365f90f6384/horizon-log/0.log" Oct 01 11:15:02 crc kubenswrapper[4735]: I1001 11:15:02.394608 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29321941-p4wv8_99c1d24d-af2e-4093-bd99-d1c1cdabd8be/keystone-cron/0.log" Oct 01 11:15:02 crc kubenswrapper[4735]: I1001 11:15:02.427540 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7888d7549b-dbpkr_98103d81-4a3c-4c99-9d51-f73f8e5fd295/keystone-api/0.log" Oct 01 11:15:02 crc kubenswrapper[4735]: I1001 11:15:02.560999 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_f9f14b82-f708-409b-94cb-34b6863dc8cc/kube-state-metrics/0.log" Oct 01 11:15:02 crc kubenswrapper[4735]: I1001 11:15:02.638054 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs_98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 11:15:02 crc kubenswrapper[4735]: I1001 11:15:02.960916 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-798b4f9b87-frx5r_d235163d-548f-40e3-9aae-490a41523da2/neutron-api/0.log" Oct 01 11:15:03 crc kubenswrapper[4735]: I1001 11:15:03.024728 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-798b4f9b87-frx5r_d235163d-548f-40e3-9aae-490a41523da2/neutron-httpd/0.log" Oct 01 11:15:03 crc kubenswrapper[4735]: I1001 11:15:03.159292 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw_9d75fc81-9126-4b8e-b623-47a8c65adb8f/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 11:15:03 crc kubenswrapper[4735]: I1001 11:15:03.387140 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321955-vp8pv" Oct 01 11:15:03 crc kubenswrapper[4735]: I1001 11:15:03.440566 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b55d28b6-1a2f-443d-a598-203423486fc3-secret-volume\") pod \"b55d28b6-1a2f-443d-a598-203423486fc3\" (UID: \"b55d28b6-1a2f-443d-a598-203423486fc3\") " Oct 01 11:15:03 crc kubenswrapper[4735]: I1001 11:15:03.440670 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vgz6\" (UniqueName: \"kubernetes.io/projected/b55d28b6-1a2f-443d-a598-203423486fc3-kube-api-access-5vgz6\") pod \"b55d28b6-1a2f-443d-a598-203423486fc3\" (UID: \"b55d28b6-1a2f-443d-a598-203423486fc3\") " Oct 01 11:15:03 crc kubenswrapper[4735]: I1001 11:15:03.440815 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b55d28b6-1a2f-443d-a598-203423486fc3-config-volume\") pod \"b55d28b6-1a2f-443d-a598-203423486fc3\" (UID: \"b55d28b6-1a2f-443d-a598-203423486fc3\") " Oct 01 11:15:03 crc kubenswrapper[4735]: I1001 11:15:03.441833 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b55d28b6-1a2f-443d-a598-203423486fc3-config-volume" (OuterVolumeSpecName: "config-volume") pod "b55d28b6-1a2f-443d-a598-203423486fc3" (UID: "b55d28b6-1a2f-443d-a598-203423486fc3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 11:15:03 crc kubenswrapper[4735]: I1001 11:15:03.452250 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b55d28b6-1a2f-443d-a598-203423486fc3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b55d28b6-1a2f-443d-a598-203423486fc3" (UID: "b55d28b6-1a2f-443d-a598-203423486fc3"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 11:15:03 crc kubenswrapper[4735]: I1001 11:15:03.452424 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b55d28b6-1a2f-443d-a598-203423486fc3-kube-api-access-5vgz6" (OuterVolumeSpecName: "kube-api-access-5vgz6") pod "b55d28b6-1a2f-443d-a598-203423486fc3" (UID: "b55d28b6-1a2f-443d-a598-203423486fc3"). InnerVolumeSpecName "kube-api-access-5vgz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:15:03 crc kubenswrapper[4735]: I1001 11:15:03.544879 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b55d28b6-1a2f-443d-a598-203423486fc3-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 11:15:03 crc kubenswrapper[4735]: I1001 11:15:03.544920 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b55d28b6-1a2f-443d-a598-203423486fc3-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 11:15:03 crc kubenswrapper[4735]: I1001 11:15:03.544934 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vgz6\" (UniqueName: \"kubernetes.io/projected/b55d28b6-1a2f-443d-a598-203423486fc3-kube-api-access-5vgz6\") on node \"crc\" DevicePath \"\"" Oct 01 11:15:03 crc kubenswrapper[4735]: I1001 11:15:03.813941 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_62b96f55-0807-4922-acaf-a84037e549ff/nova-api-log/0.log" Oct 01 11:15:03 crc kubenswrapper[4735]: I1001 11:15:03.878539 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_61a477e3-4a22-4c96-bfdb-c72c65d4984c/nova-cell0-conductor-conductor/0.log" Oct 01 11:15:04 crc kubenswrapper[4735]: I1001 11:15:04.040827 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_62b96f55-0807-4922-acaf-a84037e549ff/nova-api-api/0.log" Oct 01 11:15:04 crc 
kubenswrapper[4735]: I1001 11:15:04.061053 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321955-vp8pv" event={"ID":"b55d28b6-1a2f-443d-a598-203423486fc3","Type":"ContainerDied","Data":"8d0ba3d92f3eb0914d0c69a613bb8ef3c61dfe330c4102261207653c651abe22"} Oct 01 11:15:04 crc kubenswrapper[4735]: I1001 11:15:04.061090 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d0ba3d92f3eb0914d0c69a613bb8ef3c61dfe330c4102261207653c651abe22" Oct 01 11:15:04 crc kubenswrapper[4735]: I1001 11:15:04.061144 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321955-vp8pv" Oct 01 11:15:04 crc kubenswrapper[4735]: I1001 11:15:04.229810 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_bf5a2979-222f-459d-9c57-599ebc27167e/nova-cell1-conductor-conductor/0.log" Oct 01 11:15:04 crc kubenswrapper[4735]: I1001 11:15:04.369879 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_8639e0ae-f968-4b8f-b73d-52c2aba0ad24/nova-cell1-novncproxy-novncproxy/0.log" Oct 01 11:15:04 crc kubenswrapper[4735]: I1001 11:15:04.453982 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321910-74whj"] Oct 01 11:15:04 crc kubenswrapper[4735]: I1001 11:15:04.460885 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321910-74whj"] Oct 01 11:15:04 crc kubenswrapper[4735]: I1001 11:15:04.559354 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-m7tr8_83863343-c31f-484c-9e44-3e6ed41988d8/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 11:15:04 crc kubenswrapper[4735]: I1001 11:15:04.768554 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_79b8eab7-e3a4-4194-852d-1f1b91155a7d/nova-metadata-log/0.log" Oct 01 11:15:05 crc kubenswrapper[4735]: I1001 11:15:05.152196 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_6ce10bc0-35a7-49e7-b138-196478a093d0/nova-scheduler-scheduler/0.log" Oct 01 11:15:05 crc kubenswrapper[4735]: I1001 11:15:05.310032 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ec099172-9672-4553-94cd-c430818da51d/mysql-bootstrap/0.log" Oct 01 11:15:05 crc kubenswrapper[4735]: I1001 11:15:05.485366 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 11:15:05 crc kubenswrapper[4735]: I1001 11:15:05.485412 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 11:15:05 crc kubenswrapper[4735]: I1001 11:15:05.485448 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" Oct 01 11:15:05 crc kubenswrapper[4735]: I1001 11:15:05.485886 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7c0fff2ad8c11c90072972b08d9ebb58c252453d2fef2077d87db58eb0f716d0"} pod="openshift-machine-config-operator/machine-config-daemon-xgg24" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 11:15:05 crc kubenswrapper[4735]: I1001 
11:15:05.485936 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" containerID="cri-o://7c0fff2ad8c11c90072972b08d9ebb58c252453d2fef2077d87db58eb0f716d0" gracePeriod=600 Oct 01 11:15:05 crc kubenswrapper[4735]: I1001 11:15:05.576561 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ec099172-9672-4553-94cd-c430818da51d/galera/0.log" Oct 01 11:15:05 crc kubenswrapper[4735]: I1001 11:15:05.577783 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ec099172-9672-4553-94cd-c430818da51d/mysql-bootstrap/0.log" Oct 01 11:15:05 crc kubenswrapper[4735]: I1001 11:15:05.835031 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_15f57822-7418-47e6-b679-aea87612b3ec/mysql-bootstrap/0.log" Oct 01 11:15:05 crc kubenswrapper[4735]: I1001 11:15:05.908811 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="692a51ac-dd61-47a9-ac24-64f95d1cb6d1" path="/var/lib/kubelet/pods/692a51ac-dd61-47a9-ac24-64f95d1cb6d1/volumes" Oct 01 11:15:05 crc kubenswrapper[4735]: I1001 11:15:05.995969 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_79b8eab7-e3a4-4194-852d-1f1b91155a7d/nova-metadata-metadata/0.log" Oct 01 11:15:06 crc kubenswrapper[4735]: I1001 11:15:06.007606 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_15f57822-7418-47e6-b679-aea87612b3ec/mysql-bootstrap/0.log" Oct 01 11:15:06 crc kubenswrapper[4735]: I1001 11:15:06.041189 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_15f57822-7418-47e6-b679-aea87612b3ec/galera/0.log" Oct 01 11:15:06 crc kubenswrapper[4735]: I1001 11:15:06.077214 4735 generic.go:334] "Generic (PLEG): 
container finished" podID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerID="7c0fff2ad8c11c90072972b08d9ebb58c252453d2fef2077d87db58eb0f716d0" exitCode=0 Oct 01 11:15:06 crc kubenswrapper[4735]: I1001 11:15:06.077257 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" event={"ID":"8c2fdbf0-2469-4ca0-8624-d63609123cd1","Type":"ContainerDied","Data":"7c0fff2ad8c11c90072972b08d9ebb58c252453d2fef2077d87db58eb0f716d0"} Oct 01 11:15:06 crc kubenswrapper[4735]: I1001 11:15:06.077282 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" event={"ID":"8c2fdbf0-2469-4ca0-8624-d63609123cd1","Type":"ContainerStarted","Data":"9aadc0db45cd48b358579f41dff851b6673b4158d94163b133ccc844259cfe53"} Oct 01 11:15:06 crc kubenswrapper[4735]: I1001 11:15:06.077302 4735 scope.go:117] "RemoveContainer" containerID="a4914eda549785b6e3091537df03e9f39ab7844172c26d202c4f0e5020c25e24" Oct 01 11:15:06 crc kubenswrapper[4735]: I1001 11:15:06.261875 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_c4b138e5-c197-4f7d-b0c3-aed5f8dd07e5/openstackclient/0.log" Oct 01 11:15:06 crc kubenswrapper[4735]: I1001 11:15:06.524025 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-4vkk2_83a05602-9f43-41cf-af06-9e6d2109e6c9/openstack-network-exporter/0.log" Oct 01 11:15:06 crc kubenswrapper[4735]: I1001 11:15:06.604174 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4k7wp_43ce7edf-2010-4ccf-ac60-b26606130624/ovsdb-server-init/0.log" Oct 01 11:15:06 crc kubenswrapper[4735]: I1001 11:15:06.763093 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4k7wp_43ce7edf-2010-4ccf-ac60-b26606130624/ovsdb-server-init/0.log" Oct 01 11:15:06 crc kubenswrapper[4735]: I1001 11:15:06.800101 4735 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4k7wp_43ce7edf-2010-4ccf-ac60-b26606130624/ovsdb-server/0.log" Oct 01 11:15:06 crc kubenswrapper[4735]: I1001 11:15:06.802580 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4k7wp_43ce7edf-2010-4ccf-ac60-b26606130624/ovs-vswitchd/0.log" Oct 01 11:15:07 crc kubenswrapper[4735]: I1001 11:15:07.074045 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-zmn7b_3bcc7869-f6b2-4c99-adde-40577b12c99d/ovn-controller/0.log" Oct 01 11:15:07 crc kubenswrapper[4735]: I1001 11:15:07.292763 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-srqk9_d5532968-8896-44ba-a120-62bacb3bf10a/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 11:15:07 crc kubenswrapper[4735]: I1001 11:15:07.316821 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_356e4644-7f54-4b34-b72a-510958be19e5/openstack-network-exporter/0.log" Oct 01 11:15:07 crc kubenswrapper[4735]: I1001 11:15:07.512826 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_356e4644-7f54-4b34-b72a-510958be19e5/ovn-northd/0.log" Oct 01 11:15:07 crc kubenswrapper[4735]: I1001 11:15:07.582667 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_33a18104-7c91-4314-99e8-37396ef7c259/openstack-network-exporter/0.log" Oct 01 11:15:07 crc kubenswrapper[4735]: I1001 11:15:07.700965 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_33a18104-7c91-4314-99e8-37396ef7c259/ovsdbserver-nb/0.log" Oct 01 11:15:07 crc kubenswrapper[4735]: I1001 11:15:07.791715 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_aa211c9f-0ea4-4b46-9f37-c5917dd0d833/openstack-network-exporter/0.log" Oct 01 11:15:07 crc kubenswrapper[4735]: I1001 11:15:07.909428 4735 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_aa211c9f-0ea4-4b46-9f37-c5917dd0d833/ovsdbserver-sb/0.log" Oct 01 11:15:08 crc kubenswrapper[4735]: I1001 11:15:08.081489 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-85c495dfd8-k8nc5_5542515c-c850-46f5-875b-65c55c28cbdc/placement-api/0.log" Oct 01 11:15:08 crc kubenswrapper[4735]: I1001 11:15:08.210745 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-85c495dfd8-k8nc5_5542515c-c850-46f5-875b-65c55c28cbdc/placement-log/0.log" Oct 01 11:15:08 crc kubenswrapper[4735]: I1001 11:15:08.302936 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_cc51941b-ed03-480a-a90b-ba40dec75a6c/setup-container/0.log" Oct 01 11:15:08 crc kubenswrapper[4735]: I1001 11:15:08.510503 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_cc51941b-ed03-480a-a90b-ba40dec75a6c/setup-container/0.log" Oct 01 11:15:08 crc kubenswrapper[4735]: I1001 11:15:08.515401 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_cc51941b-ed03-480a-a90b-ba40dec75a6c/rabbitmq/0.log" Oct 01 11:15:08 crc kubenswrapper[4735]: I1001 11:15:08.701433 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_48a5abfa-1c13-4130-8cad-1596c95ef581/setup-container/0.log" Oct 01 11:15:08 crc kubenswrapper[4735]: I1001 11:15:08.874196 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_48a5abfa-1c13-4130-8cad-1596c95ef581/setup-container/0.log" Oct 01 11:15:08 crc kubenswrapper[4735]: I1001 11:15:08.918388 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_48a5abfa-1c13-4130-8cad-1596c95ef581/rabbitmq/0.log" Oct 01 11:15:09 crc kubenswrapper[4735]: I1001 11:15:09.078768 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-chlk8_fa85eebe-cbdf-41f6-b47b-5e844222f3fe/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 11:15:09 crc kubenswrapper[4735]: I1001 11:15:09.143910 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-ldfrh_f3b0231f-c3ff-46dd-869c-c36d62466f45/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 11:15:09 crc kubenswrapper[4735]: I1001 11:15:09.348946 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs_7847fa7b-0680-48a0-bbba-2adf6b14fcec/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 11:15:09 crc kubenswrapper[4735]: I1001 11:15:09.531550 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-c8mbg_7223ea14-1a97-4c77-bbab-5f2919606539/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 11:15:09 crc kubenswrapper[4735]: I1001 11:15:09.633808 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-dpr46_57690d02-83f0-47e6-a66f-da0ab4138820/ssh-known-hosts-edpm-deployment/0.log" Oct 01 11:15:09 crc kubenswrapper[4735]: I1001 11:15:09.870233 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-696cd688cf-kqrbf_95643272-0db0-4c04-9087-98321b57c893/proxy-server/0.log" Oct 01 11:15:09 crc kubenswrapper[4735]: I1001 11:15:09.902363 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-696cd688cf-kqrbf_95643272-0db0-4c04-9087-98321b57c893/proxy-httpd/0.log" Oct 01 11:15:10 crc kubenswrapper[4735]: I1001 11:15:10.118825 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-8gpx2_b5dd8db9-b427-4510-b6ab-82883a128fa2/swift-ring-rebalance/0.log" Oct 01 11:15:10 crc kubenswrapper[4735]: I1001 11:15:10.221984 4735 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_651abe6c-1b2e-4652-a985-74f6cf2c7e17/account-auditor/0.log" Oct 01 11:15:10 crc kubenswrapper[4735]: I1001 11:15:10.289743 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_651abe6c-1b2e-4652-a985-74f6cf2c7e17/account-reaper/0.log" Oct 01 11:15:10 crc kubenswrapper[4735]: I1001 11:15:10.384083 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_651abe6c-1b2e-4652-a985-74f6cf2c7e17/account-replicator/0.log" Oct 01 11:15:10 crc kubenswrapper[4735]: I1001 11:15:10.423827 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_651abe6c-1b2e-4652-a985-74f6cf2c7e17/account-server/0.log" Oct 01 11:15:10 crc kubenswrapper[4735]: I1001 11:15:10.478019 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_651abe6c-1b2e-4652-a985-74f6cf2c7e17/container-auditor/0.log" Oct 01 11:15:10 crc kubenswrapper[4735]: I1001 11:15:10.615311 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_651abe6c-1b2e-4652-a985-74f6cf2c7e17/container-server/0.log" Oct 01 11:15:10 crc kubenswrapper[4735]: I1001 11:15:10.628216 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_651abe6c-1b2e-4652-a985-74f6cf2c7e17/container-replicator/0.log" Oct 01 11:15:10 crc kubenswrapper[4735]: I1001 11:15:10.687811 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_651abe6c-1b2e-4652-a985-74f6cf2c7e17/container-updater/0.log" Oct 01 11:15:10 crc kubenswrapper[4735]: I1001 11:15:10.852468 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_651abe6c-1b2e-4652-a985-74f6cf2c7e17/object-auditor/0.log" Oct 01 11:15:10 crc kubenswrapper[4735]: I1001 11:15:10.890430 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_651abe6c-1b2e-4652-a985-74f6cf2c7e17/object-expirer/0.log" Oct 01 11:15:10 crc kubenswrapper[4735]: I1001 11:15:10.914009 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_651abe6c-1b2e-4652-a985-74f6cf2c7e17/object-replicator/0.log" Oct 01 11:15:11 crc kubenswrapper[4735]: I1001 11:15:11.007151 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_651abe6c-1b2e-4652-a985-74f6cf2c7e17/object-server/0.log" Oct 01 11:15:11 crc kubenswrapper[4735]: I1001 11:15:11.060645 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_651abe6c-1b2e-4652-a985-74f6cf2c7e17/object-updater/0.log" Oct 01 11:15:11 crc kubenswrapper[4735]: I1001 11:15:11.119965 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_651abe6c-1b2e-4652-a985-74f6cf2c7e17/rsync/0.log" Oct 01 11:15:11 crc kubenswrapper[4735]: I1001 11:15:11.199247 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_651abe6c-1b2e-4652-a985-74f6cf2c7e17/swift-recon-cron/0.log" Oct 01 11:15:11 crc kubenswrapper[4735]: I1001 11:15:11.374089 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq_061a2955-62b7-47d5-b62c-abb147006933/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 11:15:11 crc kubenswrapper[4735]: I1001 11:15:11.519470 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_66ee97cc-ac56-4879-b605-e2a9347213ca/tempest-tests-tempest-tests-runner/0.log" Oct 01 11:15:11 crc kubenswrapper[4735]: I1001 11:15:11.695235 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_6001430a-4111-47f2-ba18-ee2e2661bb7c/test-operator-logs-container/0.log" Oct 01 11:15:11 crc kubenswrapper[4735]: I1001 
11:15:11.866314 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-t8gc9_02bd9618-e194-4d1b-98f5-90ab53e53e39/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 11:15:20 crc kubenswrapper[4735]: I1001 11:15:20.211068 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_35686057-f8f4-4ef2-8a22-b9de9c15c9e5/memcached/0.log" Oct 01 11:15:27 crc kubenswrapper[4735]: I1001 11:15:27.719352 4735 scope.go:117] "RemoveContainer" containerID="36450a9da66059fd1c7c7b8f6925bcae76f3eb7872468f1bbffe5c54dc67a179" Oct 01 11:16:01 crc kubenswrapper[4735]: E1001 11:16:01.495734 4735 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00ab2086_d717_427a_ba6e_8d4ad379150d.slice/crio-24b50f722b52bb6c43cffa275d301c452ab565b7d5b9376663723761988fa0b8.scope\": RecentStats: unable to find data in memory cache]" Oct 01 11:16:01 crc kubenswrapper[4735]: I1001 11:16:01.621921 4735 generic.go:334] "Generic (PLEG): container finished" podID="00ab2086-d717-427a-ba6e-8d4ad379150d" containerID="24b50f722b52bb6c43cffa275d301c452ab565b7d5b9376663723761988fa0b8" exitCode=0 Oct 01 11:16:01 crc kubenswrapper[4735]: I1001 11:16:01.621977 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pkp6p/crc-debug-28cnw" event={"ID":"00ab2086-d717-427a-ba6e-8d4ad379150d","Type":"ContainerDied","Data":"24b50f722b52bb6c43cffa275d301c452ab565b7d5b9376663723761988fa0b8"} Oct 01 11:16:02 crc kubenswrapper[4735]: I1001 11:16:02.739885 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pkp6p/crc-debug-28cnw" Oct 01 11:16:02 crc kubenswrapper[4735]: I1001 11:16:02.784964 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pkp6p/crc-debug-28cnw"] Oct 01 11:16:02 crc kubenswrapper[4735]: I1001 11:16:02.793724 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pkp6p/crc-debug-28cnw"] Oct 01 11:16:02 crc kubenswrapper[4735]: I1001 11:16:02.815060 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/00ab2086-d717-427a-ba6e-8d4ad379150d-host\") pod \"00ab2086-d717-427a-ba6e-8d4ad379150d\" (UID: \"00ab2086-d717-427a-ba6e-8d4ad379150d\") " Oct 01 11:16:02 crc kubenswrapper[4735]: I1001 11:16:02.815252 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m7sj\" (UniqueName: \"kubernetes.io/projected/00ab2086-d717-427a-ba6e-8d4ad379150d-kube-api-access-7m7sj\") pod \"00ab2086-d717-427a-ba6e-8d4ad379150d\" (UID: \"00ab2086-d717-427a-ba6e-8d4ad379150d\") " Oct 01 11:16:02 crc kubenswrapper[4735]: I1001 11:16:02.815576 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/00ab2086-d717-427a-ba6e-8d4ad379150d-host" (OuterVolumeSpecName: "host") pod "00ab2086-d717-427a-ba6e-8d4ad379150d" (UID: "00ab2086-d717-427a-ba6e-8d4ad379150d"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 11:16:02 crc kubenswrapper[4735]: I1001 11:16:02.816027 4735 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/00ab2086-d717-427a-ba6e-8d4ad379150d-host\") on node \"crc\" DevicePath \"\"" Oct 01 11:16:02 crc kubenswrapper[4735]: I1001 11:16:02.823730 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00ab2086-d717-427a-ba6e-8d4ad379150d-kube-api-access-7m7sj" (OuterVolumeSpecName: "kube-api-access-7m7sj") pod "00ab2086-d717-427a-ba6e-8d4ad379150d" (UID: "00ab2086-d717-427a-ba6e-8d4ad379150d"). InnerVolumeSpecName "kube-api-access-7m7sj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:16:02 crc kubenswrapper[4735]: I1001 11:16:02.918475 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m7sj\" (UniqueName: \"kubernetes.io/projected/00ab2086-d717-427a-ba6e-8d4ad379150d-kube-api-access-7m7sj\") on node \"crc\" DevicePath \"\"" Oct 01 11:16:03 crc kubenswrapper[4735]: I1001 11:16:03.646221 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="903ce35f01f887ec2a2a26a27deb1d942a621921bf4a60d3d1fbc9392da1319e" Oct 01 11:16:03 crc kubenswrapper[4735]: I1001 11:16:03.646309 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pkp6p/crc-debug-28cnw" Oct 01 11:16:03 crc kubenswrapper[4735]: I1001 11:16:03.915156 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00ab2086-d717-427a-ba6e-8d4ad379150d" path="/var/lib/kubelet/pods/00ab2086-d717-427a-ba6e-8d4ad379150d/volumes" Oct 01 11:16:03 crc kubenswrapper[4735]: I1001 11:16:03.971443 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pkp6p/crc-debug-gsjrk"] Oct 01 11:16:03 crc kubenswrapper[4735]: E1001 11:16:03.971789 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00ab2086-d717-427a-ba6e-8d4ad379150d" containerName="container-00" Oct 01 11:16:03 crc kubenswrapper[4735]: I1001 11:16:03.971807 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="00ab2086-d717-427a-ba6e-8d4ad379150d" containerName="container-00" Oct 01 11:16:03 crc kubenswrapper[4735]: E1001 11:16:03.971829 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b55d28b6-1a2f-443d-a598-203423486fc3" containerName="collect-profiles" Oct 01 11:16:03 crc kubenswrapper[4735]: I1001 11:16:03.971836 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b55d28b6-1a2f-443d-a598-203423486fc3" containerName="collect-profiles" Oct 01 11:16:03 crc kubenswrapper[4735]: I1001 11:16:03.972014 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="00ab2086-d717-427a-ba6e-8d4ad379150d" containerName="container-00" Oct 01 11:16:03 crc kubenswrapper[4735]: I1001 11:16:03.972028 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b55d28b6-1a2f-443d-a598-203423486fc3" containerName="collect-profiles" Oct 01 11:16:03 crc kubenswrapper[4735]: I1001 11:16:03.972719 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pkp6p/crc-debug-gsjrk" Oct 01 11:16:03 crc kubenswrapper[4735]: I1001 11:16:03.974249 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-pkp6p"/"default-dockercfg-rzrgf" Oct 01 11:16:04 crc kubenswrapper[4735]: I1001 11:16:04.139334 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47r4t\" (UniqueName: \"kubernetes.io/projected/5d5b6547-a3d1-43e1-9e82-f4b05c52ba88-kube-api-access-47r4t\") pod \"crc-debug-gsjrk\" (UID: \"5d5b6547-a3d1-43e1-9e82-f4b05c52ba88\") " pod="openshift-must-gather-pkp6p/crc-debug-gsjrk" Oct 01 11:16:04 crc kubenswrapper[4735]: I1001 11:16:04.139395 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5d5b6547-a3d1-43e1-9e82-f4b05c52ba88-host\") pod \"crc-debug-gsjrk\" (UID: \"5d5b6547-a3d1-43e1-9e82-f4b05c52ba88\") " pod="openshift-must-gather-pkp6p/crc-debug-gsjrk" Oct 01 11:16:04 crc kubenswrapper[4735]: I1001 11:16:04.241166 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5d5b6547-a3d1-43e1-9e82-f4b05c52ba88-host\") pod \"crc-debug-gsjrk\" (UID: \"5d5b6547-a3d1-43e1-9e82-f4b05c52ba88\") " pod="openshift-must-gather-pkp6p/crc-debug-gsjrk" Oct 01 11:16:04 crc kubenswrapper[4735]: I1001 11:16:04.241346 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47r4t\" (UniqueName: \"kubernetes.io/projected/5d5b6547-a3d1-43e1-9e82-f4b05c52ba88-kube-api-access-47r4t\") pod \"crc-debug-gsjrk\" (UID: \"5d5b6547-a3d1-43e1-9e82-f4b05c52ba88\") " pod="openshift-must-gather-pkp6p/crc-debug-gsjrk" Oct 01 11:16:04 crc kubenswrapper[4735]: I1001 11:16:04.241742 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/5d5b6547-a3d1-43e1-9e82-f4b05c52ba88-host\") pod \"crc-debug-gsjrk\" (UID: \"5d5b6547-a3d1-43e1-9e82-f4b05c52ba88\") " pod="openshift-must-gather-pkp6p/crc-debug-gsjrk" Oct 01 11:16:04 crc kubenswrapper[4735]: I1001 11:16:04.258469 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47r4t\" (UniqueName: \"kubernetes.io/projected/5d5b6547-a3d1-43e1-9e82-f4b05c52ba88-kube-api-access-47r4t\") pod \"crc-debug-gsjrk\" (UID: \"5d5b6547-a3d1-43e1-9e82-f4b05c52ba88\") " pod="openshift-must-gather-pkp6p/crc-debug-gsjrk" Oct 01 11:16:04 crc kubenswrapper[4735]: I1001 11:16:04.295566 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pkp6p/crc-debug-gsjrk" Oct 01 11:16:04 crc kubenswrapper[4735]: I1001 11:16:04.658583 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pkp6p/crc-debug-gsjrk" event={"ID":"5d5b6547-a3d1-43e1-9e82-f4b05c52ba88","Type":"ContainerStarted","Data":"80272c7462acbb8b675f70e6d9afc1922785fb4886dc6299b6aa50a7c2bee27f"} Oct 01 11:16:04 crc kubenswrapper[4735]: I1001 11:16:04.658941 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pkp6p/crc-debug-gsjrk" event={"ID":"5d5b6547-a3d1-43e1-9e82-f4b05c52ba88","Type":"ContainerStarted","Data":"132468f8b7a1546598a321ff6e0698c9996f76452a26478e57fd914808a860a9"} Oct 01 11:16:04 crc kubenswrapper[4735]: I1001 11:16:04.678668 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pkp6p/crc-debug-gsjrk" podStartSLOduration=1.678649444 podStartE2EDuration="1.678649444s" podCreationTimestamp="2025-10-01 11:16:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:16:04.673838896 +0000 UTC m=+3523.366660158" watchObservedRunningTime="2025-10-01 11:16:04.678649444 +0000 UTC m=+3523.371470696" Oct 01 
11:16:05 crc kubenswrapper[4735]: I1001 11:16:05.669559 4735 generic.go:334] "Generic (PLEG): container finished" podID="5d5b6547-a3d1-43e1-9e82-f4b05c52ba88" containerID="80272c7462acbb8b675f70e6d9afc1922785fb4886dc6299b6aa50a7c2bee27f" exitCode=0 Oct 01 11:16:05 crc kubenswrapper[4735]: I1001 11:16:05.669645 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pkp6p/crc-debug-gsjrk" event={"ID":"5d5b6547-a3d1-43e1-9e82-f4b05c52ba88","Type":"ContainerDied","Data":"80272c7462acbb8b675f70e6d9afc1922785fb4886dc6299b6aa50a7c2bee27f"} Oct 01 11:16:06 crc kubenswrapper[4735]: I1001 11:16:06.765701 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pkp6p/crc-debug-gsjrk" Oct 01 11:16:06 crc kubenswrapper[4735]: I1001 11:16:06.885791 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47r4t\" (UniqueName: \"kubernetes.io/projected/5d5b6547-a3d1-43e1-9e82-f4b05c52ba88-kube-api-access-47r4t\") pod \"5d5b6547-a3d1-43e1-9e82-f4b05c52ba88\" (UID: \"5d5b6547-a3d1-43e1-9e82-f4b05c52ba88\") " Oct 01 11:16:06 crc kubenswrapper[4735]: I1001 11:16:06.885881 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5d5b6547-a3d1-43e1-9e82-f4b05c52ba88-host\") pod \"5d5b6547-a3d1-43e1-9e82-f4b05c52ba88\" (UID: \"5d5b6547-a3d1-43e1-9e82-f4b05c52ba88\") " Oct 01 11:16:06 crc kubenswrapper[4735]: I1001 11:16:06.886326 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5d5b6547-a3d1-43e1-9e82-f4b05c52ba88-host" (OuterVolumeSpecName: "host") pod "5d5b6547-a3d1-43e1-9e82-f4b05c52ba88" (UID: "5d5b6547-a3d1-43e1-9e82-f4b05c52ba88"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 11:16:06 crc kubenswrapper[4735]: I1001 11:16:06.892694 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d5b6547-a3d1-43e1-9e82-f4b05c52ba88-kube-api-access-47r4t" (OuterVolumeSpecName: "kube-api-access-47r4t") pod "5d5b6547-a3d1-43e1-9e82-f4b05c52ba88" (UID: "5d5b6547-a3d1-43e1-9e82-f4b05c52ba88"). InnerVolumeSpecName "kube-api-access-47r4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:16:06 crc kubenswrapper[4735]: I1001 11:16:06.987669 4735 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5d5b6547-a3d1-43e1-9e82-f4b05c52ba88-host\") on node \"crc\" DevicePath \"\"" Oct 01 11:16:06 crc kubenswrapper[4735]: I1001 11:16:06.987696 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47r4t\" (UniqueName: \"kubernetes.io/projected/5d5b6547-a3d1-43e1-9e82-f4b05c52ba88-kube-api-access-47r4t\") on node \"crc\" DevicePath \"\"" Oct 01 11:16:07 crc kubenswrapper[4735]: I1001 11:16:07.689254 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pkp6p/crc-debug-gsjrk" event={"ID":"5d5b6547-a3d1-43e1-9e82-f4b05c52ba88","Type":"ContainerDied","Data":"132468f8b7a1546598a321ff6e0698c9996f76452a26478e57fd914808a860a9"} Oct 01 11:16:07 crc kubenswrapper[4735]: I1001 11:16:07.690110 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="132468f8b7a1546598a321ff6e0698c9996f76452a26478e57fd914808a860a9" Oct 01 11:16:07 crc kubenswrapper[4735]: I1001 11:16:07.689316 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pkp6p/crc-debug-gsjrk" Oct 01 11:16:11 crc kubenswrapper[4735]: I1001 11:16:11.675230 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pkp6p/crc-debug-gsjrk"] Oct 01 11:16:11 crc kubenswrapper[4735]: I1001 11:16:11.682135 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pkp6p/crc-debug-gsjrk"] Oct 01 11:16:11 crc kubenswrapper[4735]: I1001 11:16:11.912100 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d5b6547-a3d1-43e1-9e82-f4b05c52ba88" path="/var/lib/kubelet/pods/5d5b6547-a3d1-43e1-9e82-f4b05c52ba88/volumes" Oct 01 11:16:12 crc kubenswrapper[4735]: I1001 11:16:12.930248 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pkp6p/crc-debug-qxf6q"] Oct 01 11:16:12 crc kubenswrapper[4735]: E1001 11:16:12.930714 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d5b6547-a3d1-43e1-9e82-f4b05c52ba88" containerName="container-00" Oct 01 11:16:12 crc kubenswrapper[4735]: I1001 11:16:12.930729 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d5b6547-a3d1-43e1-9e82-f4b05c52ba88" containerName="container-00" Oct 01 11:16:12 crc kubenswrapper[4735]: I1001 11:16:12.930978 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d5b6547-a3d1-43e1-9e82-f4b05c52ba88" containerName="container-00" Oct 01 11:16:12 crc kubenswrapper[4735]: I1001 11:16:12.931677 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pkp6p/crc-debug-qxf6q" Oct 01 11:16:12 crc kubenswrapper[4735]: I1001 11:16:12.934611 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-pkp6p"/"default-dockercfg-rzrgf" Oct 01 11:16:12 crc kubenswrapper[4735]: I1001 11:16:12.987532 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7vb7\" (UniqueName: \"kubernetes.io/projected/1d29c345-d6b4-45fd-95b0-f49b2689a207-kube-api-access-b7vb7\") pod \"crc-debug-qxf6q\" (UID: \"1d29c345-d6b4-45fd-95b0-f49b2689a207\") " pod="openshift-must-gather-pkp6p/crc-debug-qxf6q" Oct 01 11:16:12 crc kubenswrapper[4735]: I1001 11:16:12.987664 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1d29c345-d6b4-45fd-95b0-f49b2689a207-host\") pod \"crc-debug-qxf6q\" (UID: \"1d29c345-d6b4-45fd-95b0-f49b2689a207\") " pod="openshift-must-gather-pkp6p/crc-debug-qxf6q" Oct 01 11:16:13 crc kubenswrapper[4735]: I1001 11:16:13.089693 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1d29c345-d6b4-45fd-95b0-f49b2689a207-host\") pod \"crc-debug-qxf6q\" (UID: \"1d29c345-d6b4-45fd-95b0-f49b2689a207\") " pod="openshift-must-gather-pkp6p/crc-debug-qxf6q" Oct 01 11:16:13 crc kubenswrapper[4735]: I1001 11:16:13.089861 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1d29c345-d6b4-45fd-95b0-f49b2689a207-host\") pod \"crc-debug-qxf6q\" (UID: \"1d29c345-d6b4-45fd-95b0-f49b2689a207\") " pod="openshift-must-gather-pkp6p/crc-debug-qxf6q" Oct 01 11:16:13 crc kubenswrapper[4735]: I1001 11:16:13.090314 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7vb7\" (UniqueName: 
\"kubernetes.io/projected/1d29c345-d6b4-45fd-95b0-f49b2689a207-kube-api-access-b7vb7\") pod \"crc-debug-qxf6q\" (UID: \"1d29c345-d6b4-45fd-95b0-f49b2689a207\") " pod="openshift-must-gather-pkp6p/crc-debug-qxf6q" Oct 01 11:16:13 crc kubenswrapper[4735]: I1001 11:16:13.110689 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7vb7\" (UniqueName: \"kubernetes.io/projected/1d29c345-d6b4-45fd-95b0-f49b2689a207-kube-api-access-b7vb7\") pod \"crc-debug-qxf6q\" (UID: \"1d29c345-d6b4-45fd-95b0-f49b2689a207\") " pod="openshift-must-gather-pkp6p/crc-debug-qxf6q" Oct 01 11:16:13 crc kubenswrapper[4735]: I1001 11:16:13.254555 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pkp6p/crc-debug-qxf6q" Oct 01 11:16:13 crc kubenswrapper[4735]: I1001 11:16:13.748744 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pkp6p/crc-debug-qxf6q" event={"ID":"1d29c345-d6b4-45fd-95b0-f49b2689a207","Type":"ContainerDied","Data":"184f70b8f3fc946a88749822171c17fdae25bf19d2127a92efae2a4bd6ae302d"} Oct 01 11:16:13 crc kubenswrapper[4735]: I1001 11:16:13.748635 4735 generic.go:334] "Generic (PLEG): container finished" podID="1d29c345-d6b4-45fd-95b0-f49b2689a207" containerID="184f70b8f3fc946a88749822171c17fdae25bf19d2127a92efae2a4bd6ae302d" exitCode=0 Oct 01 11:16:13 crc kubenswrapper[4735]: I1001 11:16:13.749301 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pkp6p/crc-debug-qxf6q" event={"ID":"1d29c345-d6b4-45fd-95b0-f49b2689a207","Type":"ContainerStarted","Data":"137f3f6ba3f6f5294e3ed7ecedb6f922c644975d81fe2584cf9ddbaba21550e2"} Oct 01 11:16:13 crc kubenswrapper[4735]: I1001 11:16:13.823114 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pkp6p/crc-debug-qxf6q"] Oct 01 11:16:13 crc kubenswrapper[4735]: I1001 11:16:13.835909 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-pkp6p/crc-debug-qxf6q"] Oct 01 11:16:14 crc kubenswrapper[4735]: I1001 11:16:14.864570 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pkp6p/crc-debug-qxf6q" Oct 01 11:16:14 crc kubenswrapper[4735]: I1001 11:16:14.936410 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1d29c345-d6b4-45fd-95b0-f49b2689a207-host\") pod \"1d29c345-d6b4-45fd-95b0-f49b2689a207\" (UID: \"1d29c345-d6b4-45fd-95b0-f49b2689a207\") " Oct 01 11:16:14 crc kubenswrapper[4735]: I1001 11:16:14.936521 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d29c345-d6b4-45fd-95b0-f49b2689a207-host" (OuterVolumeSpecName: "host") pod "1d29c345-d6b4-45fd-95b0-f49b2689a207" (UID: "1d29c345-d6b4-45fd-95b0-f49b2689a207"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 11:16:14 crc kubenswrapper[4735]: I1001 11:16:14.936598 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7vb7\" (UniqueName: \"kubernetes.io/projected/1d29c345-d6b4-45fd-95b0-f49b2689a207-kube-api-access-b7vb7\") pod \"1d29c345-d6b4-45fd-95b0-f49b2689a207\" (UID: \"1d29c345-d6b4-45fd-95b0-f49b2689a207\") " Oct 01 11:16:14 crc kubenswrapper[4735]: I1001 11:16:14.937099 4735 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1d29c345-d6b4-45fd-95b0-f49b2689a207-host\") on node \"crc\" DevicePath \"\"" Oct 01 11:16:14 crc kubenswrapper[4735]: I1001 11:16:14.943713 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d29c345-d6b4-45fd-95b0-f49b2689a207-kube-api-access-b7vb7" (OuterVolumeSpecName: "kube-api-access-b7vb7") pod "1d29c345-d6b4-45fd-95b0-f49b2689a207" (UID: "1d29c345-d6b4-45fd-95b0-f49b2689a207"). 
InnerVolumeSpecName "kube-api-access-b7vb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:16:15 crc kubenswrapper[4735]: I1001 11:16:15.039733 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7vb7\" (UniqueName: \"kubernetes.io/projected/1d29c345-d6b4-45fd-95b0-f49b2689a207-kube-api-access-b7vb7\") on node \"crc\" DevicePath \"\"" Oct 01 11:16:15 crc kubenswrapper[4735]: I1001 11:16:15.351453 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc_820a7ecc-7cdd-4ef7-8b74-64419cb96b2d/util/0.log" Oct 01 11:16:15 crc kubenswrapper[4735]: I1001 11:16:15.515878 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc_820a7ecc-7cdd-4ef7-8b74-64419cb96b2d/pull/0.log" Oct 01 11:16:15 crc kubenswrapper[4735]: I1001 11:16:15.528487 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc_820a7ecc-7cdd-4ef7-8b74-64419cb96b2d/util/0.log" Oct 01 11:16:15 crc kubenswrapper[4735]: I1001 11:16:15.549216 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc_820a7ecc-7cdd-4ef7-8b74-64419cb96b2d/pull/0.log" Oct 01 11:16:15 crc kubenswrapper[4735]: I1001 11:16:15.680372 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc_820a7ecc-7cdd-4ef7-8b74-64419cb96b2d/util/0.log" Oct 01 11:16:15 crc kubenswrapper[4735]: I1001 11:16:15.709915 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc_820a7ecc-7cdd-4ef7-8b74-64419cb96b2d/extract/0.log" Oct 01 11:16:15 crc kubenswrapper[4735]: I1001 
11:16:15.725797 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc_820a7ecc-7cdd-4ef7-8b74-64419cb96b2d/pull/0.log" Oct 01 11:16:15 crc kubenswrapper[4735]: I1001 11:16:15.767238 4735 scope.go:117] "RemoveContainer" containerID="184f70b8f3fc946a88749822171c17fdae25bf19d2127a92efae2a4bd6ae302d" Oct 01 11:16:15 crc kubenswrapper[4735]: I1001 11:16:15.767258 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pkp6p/crc-debug-qxf6q" Oct 01 11:16:15 crc kubenswrapper[4735]: I1001 11:16:15.908640 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d29c345-d6b4-45fd-95b0-f49b2689a207" path="/var/lib/kubelet/pods/1d29c345-d6b4-45fd-95b0-f49b2689a207/volumes" Oct 01 11:16:15 crc kubenswrapper[4735]: I1001 11:16:15.916117 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-v88gl_27b0c660-ca46-48cb-88ca-bb5715532c80/manager/0.log" Oct 01 11:16:15 crc kubenswrapper[4735]: I1001 11:16:15.918263 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-v88gl_27b0c660-ca46-48cb-88ca-bb5715532c80/kube-rbac-proxy/0.log" Oct 01 11:16:15 crc kubenswrapper[4735]: I1001 11:16:15.938155 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-l5lmn_db6d8b12-0a21-40a5-b23a-e943494a2091/kube-rbac-proxy/0.log" Oct 01 11:16:16 crc kubenswrapper[4735]: I1001 11:16:16.098635 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-l5lmn_db6d8b12-0a21-40a5-b23a-e943494a2091/manager/0.log" Oct 01 11:16:16 crc kubenswrapper[4735]: I1001 11:16:16.108983 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-fj5j6_263d49ed-e577-457b-b887-33f95f1bbed0/kube-rbac-proxy/0.log" Oct 01 11:16:16 crc kubenswrapper[4735]: I1001 11:16:16.117193 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-fj5j6_263d49ed-e577-457b-b887-33f95f1bbed0/manager/0.log" Oct 01 11:16:16 crc kubenswrapper[4735]: I1001 11:16:16.278996 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-hgmvf_bb3f6d5e-c5f1-40a5-b85e-dc33dd683d98/kube-rbac-proxy/0.log" Oct 01 11:16:16 crc kubenswrapper[4735]: I1001 11:16:16.329758 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-hgmvf_bb3f6d5e-c5f1-40a5-b85e-dc33dd683d98/manager/0.log" Oct 01 11:16:16 crc kubenswrapper[4735]: I1001 11:16:16.485932 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-qp9c7_c229f8d8-2a3c-4a42-ae75-f83e7bbc99e9/manager/0.log" Oct 01 11:16:16 crc kubenswrapper[4735]: I1001 11:16:16.499329 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-qp9c7_c229f8d8-2a3c-4a42-ae75-f83e7bbc99e9/kube-rbac-proxy/0.log" Oct 01 11:16:16 crc kubenswrapper[4735]: I1001 11:16:16.552975 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-zsvgx_7df184a1-eb46-4e19-85cd-82e8d5da1880/kube-rbac-proxy/0.log" Oct 01 11:16:16 crc kubenswrapper[4735]: I1001 11:16:16.693884 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-zsvgx_7df184a1-eb46-4e19-85cd-82e8d5da1880/manager/0.log" Oct 01 11:16:16 crc kubenswrapper[4735]: I1001 11:16:16.733778 
4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-txp8d_a48cc547-6994-48eb-b8db-9682c091fdac/kube-rbac-proxy/0.log" Oct 01 11:16:16 crc kubenswrapper[4735]: I1001 11:16:16.858369 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-txp8d_a48cc547-6994-48eb-b8db-9682c091fdac/manager/0.log" Oct 01 11:16:16 crc kubenswrapper[4735]: I1001 11:16:16.920726 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-hw4lp_8a30378d-5948-4a39-b5ec-85b29f5763e9/kube-rbac-proxy/0.log" Oct 01 11:16:16 crc kubenswrapper[4735]: I1001 11:16:16.967409 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-hw4lp_8a30378d-5948-4a39-b5ec-85b29f5763e9/manager/0.log" Oct 01 11:16:17 crc kubenswrapper[4735]: I1001 11:16:17.104600 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-wftzg_628c079b-418b-4299-9394-a59ab2850d23/kube-rbac-proxy/0.log" Oct 01 11:16:17 crc kubenswrapper[4735]: I1001 11:16:17.172648 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-wftzg_628c079b-418b-4299-9394-a59ab2850d23/manager/0.log" Oct 01 11:16:17 crc kubenswrapper[4735]: I1001 11:16:17.199375 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-7hcv5_a1e6fcf2-bfad-48fe-b655-0e9199818230/kube-rbac-proxy/0.log" Oct 01 11:16:17 crc kubenswrapper[4735]: I1001 11:16:17.317566 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-7hcv5_a1e6fcf2-bfad-48fe-b655-0e9199818230/manager/0.log" Oct 01 11:16:17 crc 
kubenswrapper[4735]: I1001 11:16:17.337111 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-m5vc9_912f537c-db9b-4256-a5e0-81dc33bcaf3e/kube-rbac-proxy/0.log" Oct 01 11:16:17 crc kubenswrapper[4735]: I1001 11:16:17.433919 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-m5vc9_912f537c-db9b-4256-a5e0-81dc33bcaf3e/manager/0.log" Oct 01 11:16:17 crc kubenswrapper[4735]: I1001 11:16:17.524286 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-9k2wg_663344ae-dd00-416c-9120-d4f0721554b4/kube-rbac-proxy/0.log" Oct 01 11:16:17 crc kubenswrapper[4735]: I1001 11:16:17.612424 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-9k2wg_663344ae-dd00-416c-9120-d4f0721554b4/manager/0.log" Oct 01 11:16:17 crc kubenswrapper[4735]: I1001 11:16:17.697675 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-mr6vv_3fd0c004-178e-41cb-be27-2d2342d9f58c/kube-rbac-proxy/0.log" Oct 01 11:16:17 crc kubenswrapper[4735]: I1001 11:16:17.807595 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-mr6vv_3fd0c004-178e-41cb-be27-2d2342d9f58c/manager/0.log" Oct 01 11:16:17 crc kubenswrapper[4735]: I1001 11:16:17.840185 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-lg9mq_6b1d4e62-3eb6-4090-826d-e627e08c73c6/kube-rbac-proxy/0.log" Oct 01 11:16:17 crc kubenswrapper[4735]: I1001 11:16:17.879400 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-lg9mq_6b1d4e62-3eb6-4090-826d-e627e08c73c6/manager/0.log" Oct 01 11:16:18 crc kubenswrapper[4735]: I1001 11:16:18.005015 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77b9676b8cllzwc_ef40c559-9b0d-478a-baa4-239ab6d71d76/manager/0.log" Oct 01 11:16:18 crc kubenswrapper[4735]: I1001 11:16:18.005335 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77b9676b8cllzwc_ef40c559-9b0d-478a-baa4-239ab6d71d76/kube-rbac-proxy/0.log" Oct 01 11:16:18 crc kubenswrapper[4735]: I1001 11:16:18.135943 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6b86d7dbdd-k9n2c_86607e6f-a912-47f7-b72c-8ea925c5bd53/kube-rbac-proxy/0.log" Oct 01 11:16:18 crc kubenswrapper[4735]: I1001 11:16:18.317812 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-794c859bbc-8kzp5_12603a12-a744-42f1-b0fd-e1a88e490d81/kube-rbac-proxy/0.log" Oct 01 11:16:18 crc kubenswrapper[4735]: I1001 11:16:18.480736 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-jmdr7_bf036ed7-e2ad-407c-94d5-ce386d9884b8/registry-server/0.log" Oct 01 11:16:18 crc kubenswrapper[4735]: I1001 11:16:18.484520 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-794c859bbc-8kzp5_12603a12-a744-42f1-b0fd-e1a88e490d81/operator/0.log" Oct 01 11:16:18 crc kubenswrapper[4735]: I1001 11:16:18.633932 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-2pfnh_ef5f656f-602f-475c-8bd2-078c0bb43388/kube-rbac-proxy/0.log" Oct 01 11:16:18 crc kubenswrapper[4735]: I1001 
11:16:18.766600 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-2pfnh_ef5f656f-602f-475c-8bd2-078c0bb43388/manager/0.log" Oct 01 11:16:18 crc kubenswrapper[4735]: I1001 11:16:18.858304 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-p8c9g_9fba9aa7-25bd-4d48-89e9-818af62e38af/kube-rbac-proxy/0.log" Oct 01 11:16:18 crc kubenswrapper[4735]: I1001 11:16:18.887714 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-p8c9g_9fba9aa7-25bd-4d48-89e9-818af62e38af/manager/0.log" Oct 01 11:16:19 crc kubenswrapper[4735]: I1001 11:16:19.061948 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-k2j2p_d285f86e-bf4c-4e84-8b59-039754ffb39c/operator/0.log" Oct 01 11:16:19 crc kubenswrapper[4735]: I1001 11:16:19.130447 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-j6dlz_50560653-c3f8-4fa8-9d19-a1525b1daaa2/kube-rbac-proxy/0.log" Oct 01 11:16:19 crc kubenswrapper[4735]: I1001 11:16:19.220999 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-j6dlz_50560653-c3f8-4fa8-9d19-a1525b1daaa2/manager/0.log" Oct 01 11:16:19 crc kubenswrapper[4735]: I1001 11:16:19.278470 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-svvjg_82c5ecde-4ff5-42bc-9956-45d025d53f45/kube-rbac-proxy/0.log" Oct 01 11:16:19 crc kubenswrapper[4735]: I1001 11:16:19.281184 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6b86d7dbdd-k9n2c_86607e6f-a912-47f7-b72c-8ea925c5bd53/manager/0.log" Oct 01 
11:16:19 crc kubenswrapper[4735]: I1001 11:16:19.370844 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-svvjg_82c5ecde-4ff5-42bc-9956-45d025d53f45/manager/0.log" Oct 01 11:16:19 crc kubenswrapper[4735]: I1001 11:16:19.450269 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-5w5cv_dfaa86bf-1e61-4393-ba0f-b9003fdbde80/manager/0.log" Oct 01 11:16:19 crc kubenswrapper[4735]: I1001 11:16:19.453308 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-5w5cv_dfaa86bf-1e61-4393-ba0f-b9003fdbde80/kube-rbac-proxy/0.log" Oct 01 11:16:19 crc kubenswrapper[4735]: I1001 11:16:19.540155 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6b9957f54f-f29js_c4ac4b8f-7378-438b-8412-1b74d1c3fda9/kube-rbac-proxy/0.log" Oct 01 11:16:19 crc kubenswrapper[4735]: I1001 11:16:19.608247 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6b9957f54f-f29js_c4ac4b8f-7378-438b-8412-1b74d1c3fda9/manager/0.log" Oct 01 11:16:34 crc kubenswrapper[4735]: I1001 11:16:34.095329 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-76rns_1d4a525c-7b1a-4bde-976a-d4b938c27209/control-plane-machine-set-operator/0.log" Oct 01 11:16:34 crc kubenswrapper[4735]: I1001 11:16:34.229843 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-kgr86_6963210d-abf6-43ad-80ea-72831b6d7504/kube-rbac-proxy/0.log" Oct 01 11:16:34 crc kubenswrapper[4735]: I1001 11:16:34.274576 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-kgr86_6963210d-abf6-43ad-80ea-72831b6d7504/machine-api-operator/0.log" Oct 01 11:16:45 crc kubenswrapper[4735]: I1001 11:16:45.887780 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-srpsm_4a6804d5-21c5-4d3d-9504-d769df881c52/cert-manager-controller/0.log" Oct 01 11:16:46 crc kubenswrapper[4735]: I1001 11:16:46.106691 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-mvxdt_63d28348-4347-431a-97e8-3526e9f66a68/cert-manager-cainjector/0.log" Oct 01 11:16:46 crc kubenswrapper[4735]: I1001 11:16:46.142712 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-d2xk7_a4f1a7ac-f922-4a82-8675-c87e0921512f/cert-manager-webhook/0.log" Oct 01 11:16:57 crc kubenswrapper[4735]: I1001 11:16:57.356665 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-svjp6_cb894fbc-36ef-4c41-ae21-dff369c41c99/nmstate-console-plugin/0.log" Oct 01 11:16:57 crc kubenswrapper[4735]: I1001 11:16:57.541884 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-mpptn_6f89445f-bbda-4a4e-8cc5-ceb03718ffed/kube-rbac-proxy/0.log" Oct 01 11:16:57 crc kubenswrapper[4735]: I1001 11:16:57.543018 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-p4k5f_b82af7f6-7fb7-4e7a-9787-1f3b84969763/nmstate-handler/0.log" Oct 01 11:16:57 crc kubenswrapper[4735]: I1001 11:16:57.634943 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-mpptn_6f89445f-bbda-4a4e-8cc5-ceb03718ffed/nmstate-metrics/0.log" Oct 01 11:16:57 crc kubenswrapper[4735]: I1001 11:16:57.742581 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-m6pnj_7da3bf5b-a383-430c-b587-62c7eabeedd1/nmstate-operator/0.log" Oct 01 11:16:57 crc kubenswrapper[4735]: I1001 11:16:57.845427 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-xsrh7_f37e734e-18a9-4b41-b06d-1da35b2d5654/nmstate-webhook/0.log" Oct 01 11:17:05 crc kubenswrapper[4735]: I1001 11:17:05.486108 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 11:17:05 crc kubenswrapper[4735]: I1001 11:17:05.486972 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 11:17:11 crc kubenswrapper[4735]: I1001 11:17:11.357562 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-6nnhj_d76a5dcc-a5c5-435c-9dfc-11bab4a422e9/kube-rbac-proxy/0.log" Oct 01 11:17:11 crc kubenswrapper[4735]: I1001 11:17:11.587416 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-6nnhj_d76a5dcc-a5c5-435c-9dfc-11bab4a422e9/controller/0.log" Oct 01 11:17:11 crc kubenswrapper[4735]: I1001 11:17:11.636489 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/cp-frr-files/0.log" Oct 01 11:17:11 crc kubenswrapper[4735]: I1001 11:17:11.802032 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/cp-reloader/0.log" 
Oct 01 11:17:11 crc kubenswrapper[4735]: I1001 11:17:11.811355 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/cp-metrics/0.log" Oct 01 11:17:11 crc kubenswrapper[4735]: I1001 11:17:11.844807 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/cp-reloader/0.log" Oct 01 11:17:11 crc kubenswrapper[4735]: I1001 11:17:11.857574 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/cp-frr-files/0.log" Oct 01 11:17:12 crc kubenswrapper[4735]: I1001 11:17:12.007900 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/cp-frr-files/0.log" Oct 01 11:17:12 crc kubenswrapper[4735]: I1001 11:17:12.018320 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/cp-reloader/0.log" Oct 01 11:17:12 crc kubenswrapper[4735]: I1001 11:17:12.042942 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/cp-metrics/0.log" Oct 01 11:17:12 crc kubenswrapper[4735]: I1001 11:17:12.099401 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/cp-metrics/0.log" Oct 01 11:17:12 crc kubenswrapper[4735]: I1001 11:17:12.195673 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/cp-frr-files/0.log" Oct 01 11:17:12 crc kubenswrapper[4735]: I1001 11:17:12.220765 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/cp-reloader/0.log" Oct 01 11:17:12 crc kubenswrapper[4735]: I1001 11:17:12.223274 4735 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/cp-metrics/0.log" Oct 01 11:17:12 crc kubenswrapper[4735]: I1001 11:17:12.310398 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/controller/0.log" Oct 01 11:17:12 crc kubenswrapper[4735]: I1001 11:17:12.511919 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/frr-metrics/0.log" Oct 01 11:17:12 crc kubenswrapper[4735]: I1001 11:17:12.600709 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/kube-rbac-proxy/0.log" Oct 01 11:17:12 crc kubenswrapper[4735]: I1001 11:17:12.684254 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/kube-rbac-proxy-frr/0.log" Oct 01 11:17:12 crc kubenswrapper[4735]: I1001 11:17:12.760438 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/reloader/0.log" Oct 01 11:17:12 crc kubenswrapper[4735]: I1001 11:17:12.886821 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-cgg5l_d45651e4-469d-458c-9d48-ad996f82c3f0/frr-k8s-webhook-server/0.log" Oct 01 11:17:13 crc kubenswrapper[4735]: I1001 11:17:13.089866 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-755f8bc9ff-4w5jq_a1a673c2-160f-4f1a-8bcf-fbc1e3692cb9/manager/0.log" Oct 01 11:17:13 crc kubenswrapper[4735]: I1001 11:17:13.232427 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5455ff795f-xpx6b_db9781ea-0490-415d-8e5b-7b64d4aa62dd/webhook-server/0.log" Oct 01 11:17:13 crc kubenswrapper[4735]: I1001 11:17:13.329053 
4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-9c82t_ea017fc5-1856-47da-99a5-c866738be35e/kube-rbac-proxy/0.log" Oct 01 11:17:13 crc kubenswrapper[4735]: I1001 11:17:13.810415 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/frr/0.log" Oct 01 11:17:13 crc kubenswrapper[4735]: I1001 11:17:13.860541 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-9c82t_ea017fc5-1856-47da-99a5-c866738be35e/speaker/0.log" Oct 01 11:17:26 crc kubenswrapper[4735]: I1001 11:17:26.727659 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn_b1cb929b-2595-4752-a499-91d2401f5755/util/0.log" Oct 01 11:17:26 crc kubenswrapper[4735]: I1001 11:17:26.906369 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn_b1cb929b-2595-4752-a499-91d2401f5755/util/0.log" Oct 01 11:17:26 crc kubenswrapper[4735]: I1001 11:17:26.922254 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn_b1cb929b-2595-4752-a499-91d2401f5755/pull/0.log" Oct 01 11:17:26 crc kubenswrapper[4735]: I1001 11:17:26.968565 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn_b1cb929b-2595-4752-a499-91d2401f5755/pull/0.log" Oct 01 11:17:27 crc kubenswrapper[4735]: I1001 11:17:27.124492 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn_b1cb929b-2595-4752-a499-91d2401f5755/extract/0.log" Oct 01 11:17:27 crc kubenswrapper[4735]: I1001 11:17:27.132859 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn_b1cb929b-2595-4752-a499-91d2401f5755/util/0.log" Oct 01 11:17:27 crc kubenswrapper[4735]: I1001 11:17:27.143807 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn_b1cb929b-2595-4752-a499-91d2401f5755/pull/0.log" Oct 01 11:17:27 crc kubenswrapper[4735]: I1001 11:17:27.307112 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pxqpm_cd664456-b523-413a-91a4-04d55c466f57/extract-utilities/0.log" Oct 01 11:17:27 crc kubenswrapper[4735]: I1001 11:17:27.468354 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pxqpm_cd664456-b523-413a-91a4-04d55c466f57/extract-utilities/0.log" Oct 01 11:17:27 crc kubenswrapper[4735]: I1001 11:17:27.470724 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pxqpm_cd664456-b523-413a-91a4-04d55c466f57/extract-content/0.log" Oct 01 11:17:27 crc kubenswrapper[4735]: I1001 11:17:27.480388 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pxqpm_cd664456-b523-413a-91a4-04d55c466f57/extract-content/0.log" Oct 01 11:17:27 crc kubenswrapper[4735]: I1001 11:17:27.860513 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pxqpm_cd664456-b523-413a-91a4-04d55c466f57/extract-utilities/0.log" Oct 01 11:17:27 crc kubenswrapper[4735]: I1001 11:17:27.880686 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pxqpm_cd664456-b523-413a-91a4-04d55c466f57/extract-content/0.log" Oct 01 11:17:27 crc kubenswrapper[4735]: I1001 11:17:27.972379 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-lmv8p_d0f770d3-02c9-47fb-b650-d515b1c96ea2/extract-utilities/0.log" Oct 01 11:17:28 crc kubenswrapper[4735]: I1001 11:17:28.252431 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pxqpm_cd664456-b523-413a-91a4-04d55c466f57/registry-server/0.log" Oct 01 11:17:28 crc kubenswrapper[4735]: I1001 11:17:28.267068 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lmv8p_d0f770d3-02c9-47fb-b650-d515b1c96ea2/extract-utilities/0.log" Oct 01 11:17:28 crc kubenswrapper[4735]: I1001 11:17:28.305987 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lmv8p_d0f770d3-02c9-47fb-b650-d515b1c96ea2/extract-content/0.log" Oct 01 11:17:28 crc kubenswrapper[4735]: I1001 11:17:28.329038 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lmv8p_d0f770d3-02c9-47fb-b650-d515b1c96ea2/extract-content/0.log" Oct 01 11:17:28 crc kubenswrapper[4735]: I1001 11:17:28.509955 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lmv8p_d0f770d3-02c9-47fb-b650-d515b1c96ea2/extract-content/0.log" Oct 01 11:17:28 crc kubenswrapper[4735]: I1001 11:17:28.510100 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lmv8p_d0f770d3-02c9-47fb-b650-d515b1c96ea2/extract-utilities/0.log" Oct 01 11:17:28 crc kubenswrapper[4735]: I1001 11:17:28.729054 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2_e8004806-b53f-47ad-928b-1843522489ea/util/0.log" Oct 01 11:17:28 crc kubenswrapper[4735]: I1001 11:17:28.988395 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2_e8004806-b53f-47ad-928b-1843522489ea/pull/0.log" Oct 01 11:17:29 crc kubenswrapper[4735]: I1001 11:17:29.049841 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2_e8004806-b53f-47ad-928b-1843522489ea/util/0.log" Oct 01 11:17:29 crc kubenswrapper[4735]: I1001 11:17:29.054083 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lmv8p_d0f770d3-02c9-47fb-b650-d515b1c96ea2/registry-server/0.log" Oct 01 11:17:29 crc kubenswrapper[4735]: I1001 11:17:29.058556 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2_e8004806-b53f-47ad-928b-1843522489ea/pull/0.log" Oct 01 11:17:29 crc kubenswrapper[4735]: I1001 11:17:29.165042 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2_e8004806-b53f-47ad-928b-1843522489ea/util/0.log" Oct 01 11:17:29 crc kubenswrapper[4735]: I1001 11:17:29.220418 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2_e8004806-b53f-47ad-928b-1843522489ea/extract/0.log" Oct 01 11:17:29 crc kubenswrapper[4735]: I1001 11:17:29.246025 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2_e8004806-b53f-47ad-928b-1843522489ea/pull/0.log" Oct 01 11:17:29 crc kubenswrapper[4735]: I1001 11:17:29.371599 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-twfwh_94467536-0aa2-426e-a14a-bb05c8afd56c/marketplace-operator/0.log" Oct 01 11:17:29 crc kubenswrapper[4735]: 
I1001 11:17:29.422289 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dshz9_0554f347-c661-432f-ad8f-e64550027f55/extract-utilities/0.log" Oct 01 11:17:29 crc kubenswrapper[4735]: I1001 11:17:29.631955 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dshz9_0554f347-c661-432f-ad8f-e64550027f55/extract-content/0.log" Oct 01 11:17:29 crc kubenswrapper[4735]: I1001 11:17:29.669413 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dshz9_0554f347-c661-432f-ad8f-e64550027f55/extract-content/0.log" Oct 01 11:17:29 crc kubenswrapper[4735]: I1001 11:17:29.678168 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dshz9_0554f347-c661-432f-ad8f-e64550027f55/extract-utilities/0.log" Oct 01 11:17:29 crc kubenswrapper[4735]: I1001 11:17:29.901642 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dshz9_0554f347-c661-432f-ad8f-e64550027f55/extract-content/0.log" Oct 01 11:17:29 crc kubenswrapper[4735]: I1001 11:17:29.920597 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dshz9_0554f347-c661-432f-ad8f-e64550027f55/extract-utilities/0.log" Oct 01 11:17:30 crc kubenswrapper[4735]: I1001 11:17:30.027050 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dshz9_0554f347-c661-432f-ad8f-e64550027f55/registry-server/0.log" Oct 01 11:17:30 crc kubenswrapper[4735]: I1001 11:17:30.092444 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-h4kg5_921e36dc-85a8-400f-b33f-c5172a57d95b/extract-utilities/0.log" Oct 01 11:17:30 crc kubenswrapper[4735]: I1001 11:17:30.245802 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-h4kg5_921e36dc-85a8-400f-b33f-c5172a57d95b/extract-utilities/0.log" Oct 01 11:17:30 crc kubenswrapper[4735]: I1001 11:17:30.253567 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-h4kg5_921e36dc-85a8-400f-b33f-c5172a57d95b/extract-content/0.log" Oct 01 11:17:30 crc kubenswrapper[4735]: I1001 11:17:30.317782 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-h4kg5_921e36dc-85a8-400f-b33f-c5172a57d95b/extract-content/0.log" Oct 01 11:17:30 crc kubenswrapper[4735]: I1001 11:17:30.443386 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-h4kg5_921e36dc-85a8-400f-b33f-c5172a57d95b/extract-utilities/0.log" Oct 01 11:17:30 crc kubenswrapper[4735]: I1001 11:17:30.469216 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-h4kg5_921e36dc-85a8-400f-b33f-c5172a57d95b/extract-content/0.log" Oct 01 11:17:30 crc kubenswrapper[4735]: I1001 11:17:30.856088 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-h4kg5_921e36dc-85a8-400f-b33f-c5172a57d95b/registry-server/0.log" Oct 01 11:17:35 crc kubenswrapper[4735]: I1001 11:17:35.485449 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 11:17:35 crc kubenswrapper[4735]: I1001 11:17:35.486454 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Oct 01 11:17:48 crc kubenswrapper[4735]: I1001 11:17:48.573984 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pz7xn"] Oct 01 11:17:48 crc kubenswrapper[4735]: E1001 11:17:48.575022 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d29c345-d6b4-45fd-95b0-f49b2689a207" containerName="container-00" Oct 01 11:17:48 crc kubenswrapper[4735]: I1001 11:17:48.575038 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d29c345-d6b4-45fd-95b0-f49b2689a207" containerName="container-00" Oct 01 11:17:48 crc kubenswrapper[4735]: I1001 11:17:48.575235 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d29c345-d6b4-45fd-95b0-f49b2689a207" containerName="container-00" Oct 01 11:17:48 crc kubenswrapper[4735]: I1001 11:17:48.576692 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pz7xn" Oct 01 11:17:48 crc kubenswrapper[4735]: I1001 11:17:48.588258 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ecb72b6-6633-46dc-9547-f7c9f0064439-catalog-content\") pod \"redhat-marketplace-pz7xn\" (UID: \"5ecb72b6-6633-46dc-9547-f7c9f0064439\") " pod="openshift-marketplace/redhat-marketplace-pz7xn" Oct 01 11:17:48 crc kubenswrapper[4735]: I1001 11:17:48.588458 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhhtb\" (UniqueName: \"kubernetes.io/projected/5ecb72b6-6633-46dc-9547-f7c9f0064439-kube-api-access-bhhtb\") pod \"redhat-marketplace-pz7xn\" (UID: \"5ecb72b6-6633-46dc-9547-f7c9f0064439\") " pod="openshift-marketplace/redhat-marketplace-pz7xn" Oct 01 11:17:48 crc kubenswrapper[4735]: I1001 11:17:48.588678 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ecb72b6-6633-46dc-9547-f7c9f0064439-utilities\") pod \"redhat-marketplace-pz7xn\" (UID: \"5ecb72b6-6633-46dc-9547-f7c9f0064439\") " pod="openshift-marketplace/redhat-marketplace-pz7xn" Oct 01 11:17:48 crc kubenswrapper[4735]: I1001 11:17:48.591187 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pz7xn"] Oct 01 11:17:48 crc kubenswrapper[4735]: I1001 11:17:48.691002 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ecb72b6-6633-46dc-9547-f7c9f0064439-catalog-content\") pod \"redhat-marketplace-pz7xn\" (UID: \"5ecb72b6-6633-46dc-9547-f7c9f0064439\") " pod="openshift-marketplace/redhat-marketplace-pz7xn" Oct 01 11:17:48 crc kubenswrapper[4735]: I1001 11:17:48.691313 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhhtb\" (UniqueName: \"kubernetes.io/projected/5ecb72b6-6633-46dc-9547-f7c9f0064439-kube-api-access-bhhtb\") pod \"redhat-marketplace-pz7xn\" (UID: \"5ecb72b6-6633-46dc-9547-f7c9f0064439\") " pod="openshift-marketplace/redhat-marketplace-pz7xn" Oct 01 11:17:48 crc kubenswrapper[4735]: I1001 11:17:48.691411 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ecb72b6-6633-46dc-9547-f7c9f0064439-utilities\") pod \"redhat-marketplace-pz7xn\" (UID: \"5ecb72b6-6633-46dc-9547-f7c9f0064439\") " pod="openshift-marketplace/redhat-marketplace-pz7xn" Oct 01 11:17:48 crc kubenswrapper[4735]: I1001 11:17:48.691449 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ecb72b6-6633-46dc-9547-f7c9f0064439-catalog-content\") pod \"redhat-marketplace-pz7xn\" (UID: \"5ecb72b6-6633-46dc-9547-f7c9f0064439\") " pod="openshift-marketplace/redhat-marketplace-pz7xn" Oct 01 11:17:48 
crc kubenswrapper[4735]: I1001 11:17:48.691935 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ecb72b6-6633-46dc-9547-f7c9f0064439-utilities\") pod \"redhat-marketplace-pz7xn\" (UID: \"5ecb72b6-6633-46dc-9547-f7c9f0064439\") " pod="openshift-marketplace/redhat-marketplace-pz7xn" Oct 01 11:17:48 crc kubenswrapper[4735]: I1001 11:17:48.716990 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhhtb\" (UniqueName: \"kubernetes.io/projected/5ecb72b6-6633-46dc-9547-f7c9f0064439-kube-api-access-bhhtb\") pod \"redhat-marketplace-pz7xn\" (UID: \"5ecb72b6-6633-46dc-9547-f7c9f0064439\") " pod="openshift-marketplace/redhat-marketplace-pz7xn" Oct 01 11:17:48 crc kubenswrapper[4735]: I1001 11:17:48.906213 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pz7xn" Oct 01 11:17:49 crc kubenswrapper[4735]: I1001 11:17:49.437476 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pz7xn"] Oct 01 11:17:49 crc kubenswrapper[4735]: I1001 11:17:49.643931 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pz7xn" event={"ID":"5ecb72b6-6633-46dc-9547-f7c9f0064439","Type":"ContainerStarted","Data":"f8af8c1d1bf890ddf383826502e276827e664de306c7741f2e9c4081e7b2cd9a"} Oct 01 11:17:50 crc kubenswrapper[4735]: I1001 11:17:50.653110 4735 generic.go:334] "Generic (PLEG): container finished" podID="5ecb72b6-6633-46dc-9547-f7c9f0064439" containerID="eb2db0310c7363d6f4cdf8b18cf3c0cce967ccef08d1ef21f1b8ce8576574bad" exitCode=0 Oct 01 11:17:50 crc kubenswrapper[4735]: I1001 11:17:50.653341 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pz7xn" 
event={"ID":"5ecb72b6-6633-46dc-9547-f7c9f0064439","Type":"ContainerDied","Data":"eb2db0310c7363d6f4cdf8b18cf3c0cce967ccef08d1ef21f1b8ce8576574bad"} Oct 01 11:17:50 crc kubenswrapper[4735]: I1001 11:17:50.655689 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 11:17:51 crc kubenswrapper[4735]: I1001 11:17:51.664826 4735 generic.go:334] "Generic (PLEG): container finished" podID="5ecb72b6-6633-46dc-9547-f7c9f0064439" containerID="b23e229599afd785f3e7979942f813f924891304c07d51a92a142b4cf88fc35c" exitCode=0 Oct 01 11:17:51 crc kubenswrapper[4735]: I1001 11:17:51.664894 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pz7xn" event={"ID":"5ecb72b6-6633-46dc-9547-f7c9f0064439","Type":"ContainerDied","Data":"b23e229599afd785f3e7979942f813f924891304c07d51a92a142b4cf88fc35c"} Oct 01 11:17:52 crc kubenswrapper[4735]: I1001 11:17:52.682353 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pz7xn" event={"ID":"5ecb72b6-6633-46dc-9547-f7c9f0064439","Type":"ContainerStarted","Data":"2974029d99aab0efb6e0f7e02e3085e39c784ac218d909c19c4335ad09df6d70"} Oct 01 11:17:58 crc kubenswrapper[4735]: I1001 11:17:58.907405 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pz7xn" Oct 01 11:17:58 crc kubenswrapper[4735]: I1001 11:17:58.909405 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pz7xn" Oct 01 11:17:58 crc kubenswrapper[4735]: I1001 11:17:58.963538 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pz7xn" Oct 01 11:17:58 crc kubenswrapper[4735]: I1001 11:17:58.992019 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pz7xn" podStartSLOduration=9.276904873 
podStartE2EDuration="10.991997029s" podCreationTimestamp="2025-10-01 11:17:48 +0000 UTC" firstStartedPulling="2025-10-01 11:17:50.65545662 +0000 UTC m=+3629.348277882" lastFinishedPulling="2025-10-01 11:17:52.370548766 +0000 UTC m=+3631.063370038" observedRunningTime="2025-10-01 11:17:52.715725121 +0000 UTC m=+3631.408546383" watchObservedRunningTime="2025-10-01 11:17:58.991997029 +0000 UTC m=+3637.684818291" Oct 01 11:17:59 crc kubenswrapper[4735]: I1001 11:17:59.784378 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pz7xn" Oct 01 11:17:59 crc kubenswrapper[4735]: I1001 11:17:59.835173 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pz7xn"] Oct 01 11:18:01 crc kubenswrapper[4735]: I1001 11:18:01.751601 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pz7xn" podUID="5ecb72b6-6633-46dc-9547-f7c9f0064439" containerName="registry-server" containerID="cri-o://2974029d99aab0efb6e0f7e02e3085e39c784ac218d909c19c4335ad09df6d70" gracePeriod=2 Oct 01 11:18:02 crc kubenswrapper[4735]: I1001 11:18:02.262693 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pz7xn" Oct 01 11:18:02 crc kubenswrapper[4735]: I1001 11:18:02.282392 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhhtb\" (UniqueName: \"kubernetes.io/projected/5ecb72b6-6633-46dc-9547-f7c9f0064439-kube-api-access-bhhtb\") pod \"5ecb72b6-6633-46dc-9547-f7c9f0064439\" (UID: \"5ecb72b6-6633-46dc-9547-f7c9f0064439\") " Oct 01 11:18:02 crc kubenswrapper[4735]: I1001 11:18:02.282436 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ecb72b6-6633-46dc-9547-f7c9f0064439-utilities\") pod \"5ecb72b6-6633-46dc-9547-f7c9f0064439\" (UID: \"5ecb72b6-6633-46dc-9547-f7c9f0064439\") " Oct 01 11:18:02 crc kubenswrapper[4735]: I1001 11:18:02.282534 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ecb72b6-6633-46dc-9547-f7c9f0064439-catalog-content\") pod \"5ecb72b6-6633-46dc-9547-f7c9f0064439\" (UID: \"5ecb72b6-6633-46dc-9547-f7c9f0064439\") " Oct 01 11:18:02 crc kubenswrapper[4735]: I1001 11:18:02.283933 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ecb72b6-6633-46dc-9547-f7c9f0064439-utilities" (OuterVolumeSpecName: "utilities") pod "5ecb72b6-6633-46dc-9547-f7c9f0064439" (UID: "5ecb72b6-6633-46dc-9547-f7c9f0064439"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:18:02 crc kubenswrapper[4735]: I1001 11:18:02.289468 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ecb72b6-6633-46dc-9547-f7c9f0064439-kube-api-access-bhhtb" (OuterVolumeSpecName: "kube-api-access-bhhtb") pod "5ecb72b6-6633-46dc-9547-f7c9f0064439" (UID: "5ecb72b6-6633-46dc-9547-f7c9f0064439"). InnerVolumeSpecName "kube-api-access-bhhtb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:18:02 crc kubenswrapper[4735]: I1001 11:18:02.308878 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ecb72b6-6633-46dc-9547-f7c9f0064439-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5ecb72b6-6633-46dc-9547-f7c9f0064439" (UID: "5ecb72b6-6633-46dc-9547-f7c9f0064439"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:18:02 crc kubenswrapper[4735]: I1001 11:18:02.384126 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhhtb\" (UniqueName: \"kubernetes.io/projected/5ecb72b6-6633-46dc-9547-f7c9f0064439-kube-api-access-bhhtb\") on node \"crc\" DevicePath \"\"" Oct 01 11:18:02 crc kubenswrapper[4735]: I1001 11:18:02.384155 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ecb72b6-6633-46dc-9547-f7c9f0064439-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 11:18:02 crc kubenswrapper[4735]: I1001 11:18:02.384163 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ecb72b6-6633-46dc-9547-f7c9f0064439-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 11:18:02 crc kubenswrapper[4735]: I1001 11:18:02.776421 4735 generic.go:334] "Generic (PLEG): container finished" podID="5ecb72b6-6633-46dc-9547-f7c9f0064439" containerID="2974029d99aab0efb6e0f7e02e3085e39c784ac218d909c19c4335ad09df6d70" exitCode=0 Oct 01 11:18:02 crc kubenswrapper[4735]: I1001 11:18:02.776478 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pz7xn" event={"ID":"5ecb72b6-6633-46dc-9547-f7c9f0064439","Type":"ContainerDied","Data":"2974029d99aab0efb6e0f7e02e3085e39c784ac218d909c19c4335ad09df6d70"} Oct 01 11:18:02 crc kubenswrapper[4735]: I1001 11:18:02.776551 4735 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-pz7xn" event={"ID":"5ecb72b6-6633-46dc-9547-f7c9f0064439","Type":"ContainerDied","Data":"f8af8c1d1bf890ddf383826502e276827e664de306c7741f2e9c4081e7b2cd9a"} Oct 01 11:18:02 crc kubenswrapper[4735]: I1001 11:18:02.776580 4735 scope.go:117] "RemoveContainer" containerID="2974029d99aab0efb6e0f7e02e3085e39c784ac218d909c19c4335ad09df6d70" Oct 01 11:18:02 crc kubenswrapper[4735]: I1001 11:18:02.776614 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pz7xn" Oct 01 11:18:02 crc kubenswrapper[4735]: I1001 11:18:02.802288 4735 scope.go:117] "RemoveContainer" containerID="b23e229599afd785f3e7979942f813f924891304c07d51a92a142b4cf88fc35c" Oct 01 11:18:02 crc kubenswrapper[4735]: I1001 11:18:02.825254 4735 scope.go:117] "RemoveContainer" containerID="eb2db0310c7363d6f4cdf8b18cf3c0cce967ccef08d1ef21f1b8ce8576574bad" Oct 01 11:18:02 crc kubenswrapper[4735]: I1001 11:18:02.846260 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pz7xn"] Oct 01 11:18:02 crc kubenswrapper[4735]: I1001 11:18:02.857895 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pz7xn"] Oct 01 11:18:02 crc kubenswrapper[4735]: I1001 11:18:02.872834 4735 scope.go:117] "RemoveContainer" containerID="2974029d99aab0efb6e0f7e02e3085e39c784ac218d909c19c4335ad09df6d70" Oct 01 11:18:02 crc kubenswrapper[4735]: E1001 11:18:02.873224 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2974029d99aab0efb6e0f7e02e3085e39c784ac218d909c19c4335ad09df6d70\": container with ID starting with 2974029d99aab0efb6e0f7e02e3085e39c784ac218d909c19c4335ad09df6d70 not found: ID does not exist" containerID="2974029d99aab0efb6e0f7e02e3085e39c784ac218d909c19c4335ad09df6d70" Oct 01 11:18:02 crc kubenswrapper[4735]: I1001 11:18:02.873257 4735 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2974029d99aab0efb6e0f7e02e3085e39c784ac218d909c19c4335ad09df6d70"} err="failed to get container status \"2974029d99aab0efb6e0f7e02e3085e39c784ac218d909c19c4335ad09df6d70\": rpc error: code = NotFound desc = could not find container \"2974029d99aab0efb6e0f7e02e3085e39c784ac218d909c19c4335ad09df6d70\": container with ID starting with 2974029d99aab0efb6e0f7e02e3085e39c784ac218d909c19c4335ad09df6d70 not found: ID does not exist" Oct 01 11:18:02 crc kubenswrapper[4735]: I1001 11:18:02.873276 4735 scope.go:117] "RemoveContainer" containerID="b23e229599afd785f3e7979942f813f924891304c07d51a92a142b4cf88fc35c" Oct 01 11:18:02 crc kubenswrapper[4735]: E1001 11:18:02.873572 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b23e229599afd785f3e7979942f813f924891304c07d51a92a142b4cf88fc35c\": container with ID starting with b23e229599afd785f3e7979942f813f924891304c07d51a92a142b4cf88fc35c not found: ID does not exist" containerID="b23e229599afd785f3e7979942f813f924891304c07d51a92a142b4cf88fc35c" Oct 01 11:18:02 crc kubenswrapper[4735]: I1001 11:18:02.873636 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b23e229599afd785f3e7979942f813f924891304c07d51a92a142b4cf88fc35c"} err="failed to get container status \"b23e229599afd785f3e7979942f813f924891304c07d51a92a142b4cf88fc35c\": rpc error: code = NotFound desc = could not find container \"b23e229599afd785f3e7979942f813f924891304c07d51a92a142b4cf88fc35c\": container with ID starting with b23e229599afd785f3e7979942f813f924891304c07d51a92a142b4cf88fc35c not found: ID does not exist" Oct 01 11:18:02 crc kubenswrapper[4735]: I1001 11:18:02.873692 4735 scope.go:117] "RemoveContainer" containerID="eb2db0310c7363d6f4cdf8b18cf3c0cce967ccef08d1ef21f1b8ce8576574bad" Oct 01 11:18:02 crc kubenswrapper[4735]: E1001 
11:18:02.874161 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb2db0310c7363d6f4cdf8b18cf3c0cce967ccef08d1ef21f1b8ce8576574bad\": container with ID starting with eb2db0310c7363d6f4cdf8b18cf3c0cce967ccef08d1ef21f1b8ce8576574bad not found: ID does not exist" containerID="eb2db0310c7363d6f4cdf8b18cf3c0cce967ccef08d1ef21f1b8ce8576574bad" Oct 01 11:18:02 crc kubenswrapper[4735]: I1001 11:18:02.874190 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb2db0310c7363d6f4cdf8b18cf3c0cce967ccef08d1ef21f1b8ce8576574bad"} err="failed to get container status \"eb2db0310c7363d6f4cdf8b18cf3c0cce967ccef08d1ef21f1b8ce8576574bad\": rpc error: code = NotFound desc = could not find container \"eb2db0310c7363d6f4cdf8b18cf3c0cce967ccef08d1ef21f1b8ce8576574bad\": container with ID starting with eb2db0310c7363d6f4cdf8b18cf3c0cce967ccef08d1ef21f1b8ce8576574bad not found: ID does not exist" Oct 01 11:18:03 crc kubenswrapper[4735]: I1001 11:18:03.912622 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ecb72b6-6633-46dc-9547-f7c9f0064439" path="/var/lib/kubelet/pods/5ecb72b6-6633-46dc-9547-f7c9f0064439/volumes" Oct 01 11:18:05 crc kubenswrapper[4735]: I1001 11:18:05.485979 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 11:18:05 crc kubenswrapper[4735]: I1001 11:18:05.486391 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 01 11:18:05 crc kubenswrapper[4735]: I1001 11:18:05.486449 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" Oct 01 11:18:05 crc kubenswrapper[4735]: I1001 11:18:05.487469 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9aadc0db45cd48b358579f41dff851b6673b4158d94163b133ccc844259cfe53"} pod="openshift-machine-config-operator/machine-config-daemon-xgg24" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 11:18:05 crc kubenswrapper[4735]: I1001 11:18:05.487616 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" containerID="cri-o://9aadc0db45cd48b358579f41dff851b6673b4158d94163b133ccc844259cfe53" gracePeriod=600 Oct 01 11:18:05 crc kubenswrapper[4735]: E1001 11:18:05.616330 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:18:05 crc kubenswrapper[4735]: I1001 11:18:05.815026 4735 generic.go:334] "Generic (PLEG): container finished" podID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerID="9aadc0db45cd48b358579f41dff851b6673b4158d94163b133ccc844259cfe53" exitCode=0 Oct 01 11:18:05 crc kubenswrapper[4735]: I1001 11:18:05.815085 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" 
event={"ID":"8c2fdbf0-2469-4ca0-8624-d63609123cd1","Type":"ContainerDied","Data":"9aadc0db45cd48b358579f41dff851b6673b4158d94163b133ccc844259cfe53"} Oct 01 11:18:05 crc kubenswrapper[4735]: I1001 11:18:05.815132 4735 scope.go:117] "RemoveContainer" containerID="7c0fff2ad8c11c90072972b08d9ebb58c252453d2fef2077d87db58eb0f716d0" Oct 01 11:18:05 crc kubenswrapper[4735]: I1001 11:18:05.816601 4735 scope.go:117] "RemoveContainer" containerID="9aadc0db45cd48b358579f41dff851b6673b4158d94163b133ccc844259cfe53" Oct 01 11:18:05 crc kubenswrapper[4735]: E1001 11:18:05.819078 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:18:19 crc kubenswrapper[4735]: I1001 11:18:19.897215 4735 scope.go:117] "RemoveContainer" containerID="9aadc0db45cd48b358579f41dff851b6673b4158d94163b133ccc844259cfe53" Oct 01 11:18:19 crc kubenswrapper[4735]: E1001 11:18:19.898008 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:18:32 crc kubenswrapper[4735]: I1001 11:18:32.913071 4735 scope.go:117] "RemoveContainer" containerID="9aadc0db45cd48b358579f41dff851b6673b4158d94163b133ccc844259cfe53" Oct 01 11:18:32 crc kubenswrapper[4735]: E1001 11:18:32.914195 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:18:46 crc kubenswrapper[4735]: I1001 11:18:46.897372 4735 scope.go:117] "RemoveContainer" containerID="9aadc0db45cd48b358579f41dff851b6673b4158d94163b133ccc844259cfe53" Oct 01 11:18:46 crc kubenswrapper[4735]: E1001 11:18:46.899768 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:18:58 crc kubenswrapper[4735]: I1001 11:18:58.898535 4735 scope.go:117] "RemoveContainer" containerID="9aadc0db45cd48b358579f41dff851b6673b4158d94163b133ccc844259cfe53" Oct 01 11:18:58 crc kubenswrapper[4735]: E1001 11:18:58.899422 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:19:13 crc kubenswrapper[4735]: I1001 11:19:13.897277 4735 scope.go:117] "RemoveContainer" containerID="9aadc0db45cd48b358579f41dff851b6673b4158d94163b133ccc844259cfe53" Oct 01 11:19:13 crc kubenswrapper[4735]: E1001 11:19:13.899697 4735 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:19:22 crc kubenswrapper[4735]: I1001 11:19:22.714564 4735 generic.go:334] "Generic (PLEG): container finished" podID="a6a5fa64-2376-48c5-9199-b25978c0cd0e" containerID="bddc77c4dbfd99e3f9d8a234ec6ca5919ca783f18cef07eedddc8822aadaf84a" exitCode=0 Oct 01 11:19:22 crc kubenswrapper[4735]: I1001 11:19:22.715690 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pkp6p/must-gather-6s4pg" event={"ID":"a6a5fa64-2376-48c5-9199-b25978c0cd0e","Type":"ContainerDied","Data":"bddc77c4dbfd99e3f9d8a234ec6ca5919ca783f18cef07eedddc8822aadaf84a"} Oct 01 11:19:22 crc kubenswrapper[4735]: I1001 11:19:22.716452 4735 scope.go:117] "RemoveContainer" containerID="bddc77c4dbfd99e3f9d8a234ec6ca5919ca783f18cef07eedddc8822aadaf84a" Oct 01 11:19:23 crc kubenswrapper[4735]: I1001 11:19:23.057739 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pkp6p_must-gather-6s4pg_a6a5fa64-2376-48c5-9199-b25978c0cd0e/gather/0.log" Oct 01 11:19:24 crc kubenswrapper[4735]: I1001 11:19:24.896476 4735 scope.go:117] "RemoveContainer" containerID="9aadc0db45cd48b358579f41dff851b6673b4158d94163b133ccc844259cfe53" Oct 01 11:19:24 crc kubenswrapper[4735]: E1001 11:19:24.896957 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" 
podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:19:31 crc kubenswrapper[4735]: I1001 11:19:31.054162 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pkp6p/must-gather-6s4pg"] Oct 01 11:19:31 crc kubenswrapper[4735]: I1001 11:19:31.055027 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-pkp6p/must-gather-6s4pg" podUID="a6a5fa64-2376-48c5-9199-b25978c0cd0e" containerName="copy" containerID="cri-o://b53ff69a76e675c806283e11cd9b4e413721af7845039daafdf1705b6a087c38" gracePeriod=2 Oct 01 11:19:31 crc kubenswrapper[4735]: I1001 11:19:31.060943 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pkp6p/must-gather-6s4pg"] Oct 01 11:19:31 crc kubenswrapper[4735]: I1001 11:19:31.594628 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pkp6p_must-gather-6s4pg_a6a5fa64-2376-48c5-9199-b25978c0cd0e/copy/0.log" Oct 01 11:19:31 crc kubenswrapper[4735]: I1001 11:19:31.595617 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pkp6p/must-gather-6s4pg" Oct 01 11:19:31 crc kubenswrapper[4735]: I1001 11:19:31.710906 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6chw\" (UniqueName: \"kubernetes.io/projected/a6a5fa64-2376-48c5-9199-b25978c0cd0e-kube-api-access-q6chw\") pod \"a6a5fa64-2376-48c5-9199-b25978c0cd0e\" (UID: \"a6a5fa64-2376-48c5-9199-b25978c0cd0e\") " Oct 01 11:19:31 crc kubenswrapper[4735]: I1001 11:19:31.710983 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a6a5fa64-2376-48c5-9199-b25978c0cd0e-must-gather-output\") pod \"a6a5fa64-2376-48c5-9199-b25978c0cd0e\" (UID: \"a6a5fa64-2376-48c5-9199-b25978c0cd0e\") " Oct 01 11:19:31 crc kubenswrapper[4735]: I1001 11:19:31.719395 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6a5fa64-2376-48c5-9199-b25978c0cd0e-kube-api-access-q6chw" (OuterVolumeSpecName: "kube-api-access-q6chw") pod "a6a5fa64-2376-48c5-9199-b25978c0cd0e" (UID: "a6a5fa64-2376-48c5-9199-b25978c0cd0e"). InnerVolumeSpecName "kube-api-access-q6chw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:19:31 crc kubenswrapper[4735]: I1001 11:19:31.815389 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6chw\" (UniqueName: \"kubernetes.io/projected/a6a5fa64-2376-48c5-9199-b25978c0cd0e-kube-api-access-q6chw\") on node \"crc\" DevicePath \"\"" Oct 01 11:19:31 crc kubenswrapper[4735]: I1001 11:19:31.821318 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pkp6p_must-gather-6s4pg_a6a5fa64-2376-48c5-9199-b25978c0cd0e/copy/0.log" Oct 01 11:19:31 crc kubenswrapper[4735]: I1001 11:19:31.821951 4735 generic.go:334] "Generic (PLEG): container finished" podID="a6a5fa64-2376-48c5-9199-b25978c0cd0e" containerID="b53ff69a76e675c806283e11cd9b4e413721af7845039daafdf1705b6a087c38" exitCode=143 Oct 01 11:19:31 crc kubenswrapper[4735]: I1001 11:19:31.821992 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pkp6p/must-gather-6s4pg" Oct 01 11:19:31 crc kubenswrapper[4735]: I1001 11:19:31.822016 4735 scope.go:117] "RemoveContainer" containerID="b53ff69a76e675c806283e11cd9b4e413721af7845039daafdf1705b6a087c38" Oct 01 11:19:31 crc kubenswrapper[4735]: I1001 11:19:31.857279 4735 scope.go:117] "RemoveContainer" containerID="bddc77c4dbfd99e3f9d8a234ec6ca5919ca783f18cef07eedddc8822aadaf84a" Oct 01 11:19:31 crc kubenswrapper[4735]: I1001 11:19:31.919913 4735 scope.go:117] "RemoveContainer" containerID="b53ff69a76e675c806283e11cd9b4e413721af7845039daafdf1705b6a087c38" Oct 01 11:19:31 crc kubenswrapper[4735]: E1001 11:19:31.934679 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b53ff69a76e675c806283e11cd9b4e413721af7845039daafdf1705b6a087c38\": container with ID starting with b53ff69a76e675c806283e11cd9b4e413721af7845039daafdf1705b6a087c38 not found: ID does not exist" 
containerID="b53ff69a76e675c806283e11cd9b4e413721af7845039daafdf1705b6a087c38" Oct 01 11:19:31 crc kubenswrapper[4735]: I1001 11:19:31.934723 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b53ff69a76e675c806283e11cd9b4e413721af7845039daafdf1705b6a087c38"} err="failed to get container status \"b53ff69a76e675c806283e11cd9b4e413721af7845039daafdf1705b6a087c38\": rpc error: code = NotFound desc = could not find container \"b53ff69a76e675c806283e11cd9b4e413721af7845039daafdf1705b6a087c38\": container with ID starting with b53ff69a76e675c806283e11cd9b4e413721af7845039daafdf1705b6a087c38 not found: ID does not exist" Oct 01 11:19:31 crc kubenswrapper[4735]: I1001 11:19:31.934752 4735 scope.go:117] "RemoveContainer" containerID="bddc77c4dbfd99e3f9d8a234ec6ca5919ca783f18cef07eedddc8822aadaf84a" Oct 01 11:19:31 crc kubenswrapper[4735]: E1001 11:19:31.936971 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bddc77c4dbfd99e3f9d8a234ec6ca5919ca783f18cef07eedddc8822aadaf84a\": container with ID starting with bddc77c4dbfd99e3f9d8a234ec6ca5919ca783f18cef07eedddc8822aadaf84a not found: ID does not exist" containerID="bddc77c4dbfd99e3f9d8a234ec6ca5919ca783f18cef07eedddc8822aadaf84a" Oct 01 11:19:31 crc kubenswrapper[4735]: I1001 11:19:31.937003 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bddc77c4dbfd99e3f9d8a234ec6ca5919ca783f18cef07eedddc8822aadaf84a"} err="failed to get container status \"bddc77c4dbfd99e3f9d8a234ec6ca5919ca783f18cef07eedddc8822aadaf84a\": rpc error: code = NotFound desc = could not find container \"bddc77c4dbfd99e3f9d8a234ec6ca5919ca783f18cef07eedddc8822aadaf84a\": container with ID starting with bddc77c4dbfd99e3f9d8a234ec6ca5919ca783f18cef07eedddc8822aadaf84a not found: ID does not exist" Oct 01 11:19:31 crc kubenswrapper[4735]: I1001 11:19:31.950038 4735 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6a5fa64-2376-48c5-9199-b25978c0cd0e-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "a6a5fa64-2376-48c5-9199-b25978c0cd0e" (UID: "a6a5fa64-2376-48c5-9199-b25978c0cd0e"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:19:32 crc kubenswrapper[4735]: I1001 11:19:32.024516 4735 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a6a5fa64-2376-48c5-9199-b25978c0cd0e-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 01 11:19:33 crc kubenswrapper[4735]: I1001 11:19:33.912232 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6a5fa64-2376-48c5-9199-b25978c0cd0e" path="/var/lib/kubelet/pods/a6a5fa64-2376-48c5-9199-b25978c0cd0e/volumes" Oct 01 11:19:37 crc kubenswrapper[4735]: I1001 11:19:37.897404 4735 scope.go:117] "RemoveContainer" containerID="9aadc0db45cd48b358579f41dff851b6673b4158d94163b133ccc844259cfe53" Oct 01 11:19:37 crc kubenswrapper[4735]: E1001 11:19:37.898211 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:19:50 crc kubenswrapper[4735]: I1001 11:19:50.897913 4735 scope.go:117] "RemoveContainer" containerID="9aadc0db45cd48b358579f41dff851b6673b4158d94163b133ccc844259cfe53" Oct 01 11:19:50 crc kubenswrapper[4735]: E1001 11:19:50.900410 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:20:03 crc kubenswrapper[4735]: I1001 11:20:03.896832 4735 scope.go:117] "RemoveContainer" containerID="9aadc0db45cd48b358579f41dff851b6673b4158d94163b133ccc844259cfe53" Oct 01 11:20:03 crc kubenswrapper[4735]: E1001 11:20:03.898191 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:20:10 crc kubenswrapper[4735]: I1001 11:20:10.766267 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hr556/must-gather-c4kx7"] Oct 01 11:20:10 crc kubenswrapper[4735]: E1001 11:20:10.767215 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6a5fa64-2376-48c5-9199-b25978c0cd0e" containerName="copy" Oct 01 11:20:10 crc kubenswrapper[4735]: I1001 11:20:10.767230 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6a5fa64-2376-48c5-9199-b25978c0cd0e" containerName="copy" Oct 01 11:20:10 crc kubenswrapper[4735]: E1001 11:20:10.767249 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ecb72b6-6633-46dc-9547-f7c9f0064439" containerName="extract-utilities" Oct 01 11:20:10 crc kubenswrapper[4735]: I1001 11:20:10.767258 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ecb72b6-6633-46dc-9547-f7c9f0064439" containerName="extract-utilities" Oct 01 11:20:10 crc kubenswrapper[4735]: E1001 11:20:10.767281 4735 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="5ecb72b6-6633-46dc-9547-f7c9f0064439" containerName="extract-content" Oct 01 11:20:10 crc kubenswrapper[4735]: I1001 11:20:10.767290 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ecb72b6-6633-46dc-9547-f7c9f0064439" containerName="extract-content" Oct 01 11:20:10 crc kubenswrapper[4735]: E1001 11:20:10.767323 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6a5fa64-2376-48c5-9199-b25978c0cd0e" containerName="gather" Oct 01 11:20:10 crc kubenswrapper[4735]: I1001 11:20:10.767333 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6a5fa64-2376-48c5-9199-b25978c0cd0e" containerName="gather" Oct 01 11:20:10 crc kubenswrapper[4735]: E1001 11:20:10.767350 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ecb72b6-6633-46dc-9547-f7c9f0064439" containerName="registry-server" Oct 01 11:20:10 crc kubenswrapper[4735]: I1001 11:20:10.767358 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ecb72b6-6633-46dc-9547-f7c9f0064439" containerName="registry-server" Oct 01 11:20:10 crc kubenswrapper[4735]: I1001 11:20:10.767597 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6a5fa64-2376-48c5-9199-b25978c0cd0e" containerName="copy" Oct 01 11:20:10 crc kubenswrapper[4735]: I1001 11:20:10.767616 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ecb72b6-6633-46dc-9547-f7c9f0064439" containerName="registry-server" Oct 01 11:20:10 crc kubenswrapper[4735]: I1001 11:20:10.767651 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6a5fa64-2376-48c5-9199-b25978c0cd0e" containerName="gather" Oct 01 11:20:10 crc kubenswrapper[4735]: I1001 11:20:10.768922 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hr556/must-gather-c4kx7" Oct 01 11:20:10 crc kubenswrapper[4735]: I1001 11:20:10.770952 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-hr556"/"kube-root-ca.crt" Oct 01 11:20:10 crc kubenswrapper[4735]: I1001 11:20:10.770952 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-hr556"/"openshift-service-ca.crt" Oct 01 11:20:10 crc kubenswrapper[4735]: I1001 11:20:10.771029 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-hr556"/"default-dockercfg-ttmjz" Oct 01 11:20:10 crc kubenswrapper[4735]: I1001 11:20:10.777132 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hr556/must-gather-c4kx7"] Oct 01 11:20:10 crc kubenswrapper[4735]: I1001 11:20:10.957542 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/650ce69a-32a5-4788-aa5e-918fd264ecf7-must-gather-output\") pod \"must-gather-c4kx7\" (UID: \"650ce69a-32a5-4788-aa5e-918fd264ecf7\") " pod="openshift-must-gather-hr556/must-gather-c4kx7" Oct 01 11:20:10 crc kubenswrapper[4735]: I1001 11:20:10.957636 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b2qj\" (UniqueName: \"kubernetes.io/projected/650ce69a-32a5-4788-aa5e-918fd264ecf7-kube-api-access-5b2qj\") pod \"must-gather-c4kx7\" (UID: \"650ce69a-32a5-4788-aa5e-918fd264ecf7\") " pod="openshift-must-gather-hr556/must-gather-c4kx7" Oct 01 11:20:11 crc kubenswrapper[4735]: I1001 11:20:11.059714 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/650ce69a-32a5-4788-aa5e-918fd264ecf7-must-gather-output\") pod \"must-gather-c4kx7\" (UID: \"650ce69a-32a5-4788-aa5e-918fd264ecf7\") " 
pod="openshift-must-gather-hr556/must-gather-c4kx7" Oct 01 11:20:11 crc kubenswrapper[4735]: I1001 11:20:11.059766 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b2qj\" (UniqueName: \"kubernetes.io/projected/650ce69a-32a5-4788-aa5e-918fd264ecf7-kube-api-access-5b2qj\") pod \"must-gather-c4kx7\" (UID: \"650ce69a-32a5-4788-aa5e-918fd264ecf7\") " pod="openshift-must-gather-hr556/must-gather-c4kx7" Oct 01 11:20:11 crc kubenswrapper[4735]: I1001 11:20:11.060600 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/650ce69a-32a5-4788-aa5e-918fd264ecf7-must-gather-output\") pod \"must-gather-c4kx7\" (UID: \"650ce69a-32a5-4788-aa5e-918fd264ecf7\") " pod="openshift-must-gather-hr556/must-gather-c4kx7" Oct 01 11:20:11 crc kubenswrapper[4735]: I1001 11:20:11.082680 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b2qj\" (UniqueName: \"kubernetes.io/projected/650ce69a-32a5-4788-aa5e-918fd264ecf7-kube-api-access-5b2qj\") pod \"must-gather-c4kx7\" (UID: \"650ce69a-32a5-4788-aa5e-918fd264ecf7\") " pod="openshift-must-gather-hr556/must-gather-c4kx7" Oct 01 11:20:11 crc kubenswrapper[4735]: I1001 11:20:11.090446 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hr556/must-gather-c4kx7" Oct 01 11:20:11 crc kubenswrapper[4735]: I1001 11:20:11.572878 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hr556/must-gather-c4kx7"] Oct 01 11:20:12 crc kubenswrapper[4735]: I1001 11:20:12.239188 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hr556/must-gather-c4kx7" event={"ID":"650ce69a-32a5-4788-aa5e-918fd264ecf7","Type":"ContainerStarted","Data":"a7e861539d9f6ff7072f516af5f6b3ba05361350faba4c1a443d68867a9cc6ce"} Oct 01 11:20:12 crc kubenswrapper[4735]: I1001 11:20:12.239535 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hr556/must-gather-c4kx7" event={"ID":"650ce69a-32a5-4788-aa5e-918fd264ecf7","Type":"ContainerStarted","Data":"c36861d25e228796a3ae72000be80061d940934100f035512d7c55bd2c515224"} Oct 01 11:20:12 crc kubenswrapper[4735]: I1001 11:20:12.239559 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hr556/must-gather-c4kx7" event={"ID":"650ce69a-32a5-4788-aa5e-918fd264ecf7","Type":"ContainerStarted","Data":"121f572d57ecec432022a3c12570706aaf4ed1d586f112149e9cc44d6b82daa7"} Oct 01 11:20:12 crc kubenswrapper[4735]: I1001 11:20:12.262066 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hr556/must-gather-c4kx7" podStartSLOduration=2.262031592 podStartE2EDuration="2.262031592s" podCreationTimestamp="2025-10-01 11:20:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:20:12.254334276 +0000 UTC m=+3770.947155538" watchObservedRunningTime="2025-10-01 11:20:12.262031592 +0000 UTC m=+3770.954852854" Oct 01 11:20:15 crc kubenswrapper[4735]: I1001 11:20:15.384072 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hr556/crc-debug-xgjrq"] Oct 01 11:20:15 crc kubenswrapper[4735]: 
I1001 11:20:15.386560 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hr556/crc-debug-xgjrq" Oct 01 11:20:15 crc kubenswrapper[4735]: I1001 11:20:15.542612 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb-host\") pod \"crc-debug-xgjrq\" (UID: \"b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb\") " pod="openshift-must-gather-hr556/crc-debug-xgjrq" Oct 01 11:20:15 crc kubenswrapper[4735]: I1001 11:20:15.542733 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggclx\" (UniqueName: \"kubernetes.io/projected/b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb-kube-api-access-ggclx\") pod \"crc-debug-xgjrq\" (UID: \"b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb\") " pod="openshift-must-gather-hr556/crc-debug-xgjrq" Oct 01 11:20:15 crc kubenswrapper[4735]: I1001 11:20:15.644465 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb-host\") pod \"crc-debug-xgjrq\" (UID: \"b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb\") " pod="openshift-must-gather-hr556/crc-debug-xgjrq" Oct 01 11:20:15 crc kubenswrapper[4735]: I1001 11:20:15.644585 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggclx\" (UniqueName: \"kubernetes.io/projected/b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb-kube-api-access-ggclx\") pod \"crc-debug-xgjrq\" (UID: \"b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb\") " pod="openshift-must-gather-hr556/crc-debug-xgjrq" Oct 01 11:20:15 crc kubenswrapper[4735]: I1001 11:20:15.644714 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb-host\") pod \"crc-debug-xgjrq\" (UID: \"b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb\") 
" pod="openshift-must-gather-hr556/crc-debug-xgjrq" Oct 01 11:20:15 crc kubenswrapper[4735]: I1001 11:20:15.666768 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggclx\" (UniqueName: \"kubernetes.io/projected/b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb-kube-api-access-ggclx\") pod \"crc-debug-xgjrq\" (UID: \"b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb\") " pod="openshift-must-gather-hr556/crc-debug-xgjrq" Oct 01 11:20:15 crc kubenswrapper[4735]: I1001 11:20:15.717860 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hr556/crc-debug-xgjrq" Oct 01 11:20:16 crc kubenswrapper[4735]: I1001 11:20:16.288894 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hr556/crc-debug-xgjrq" event={"ID":"b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb","Type":"ContainerStarted","Data":"3e82b6be40849266afaafdf8afbe6435c888262bfe2300395e7e882abe03cf52"} Oct 01 11:20:16 crc kubenswrapper[4735]: I1001 11:20:16.289313 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hr556/crc-debug-xgjrq" event={"ID":"b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb","Type":"ContainerStarted","Data":"cdc49e4d06adfe05d1bf326fb939b858e511aa386210c8755c36b33c970fa516"} Oct 01 11:20:16 crc kubenswrapper[4735]: I1001 11:20:16.313744 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hr556/crc-debug-xgjrq" podStartSLOduration=1.313723571 podStartE2EDuration="1.313723571s" podCreationTimestamp="2025-10-01 11:20:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 11:20:16.302758146 +0000 UTC m=+3774.995579408" watchObservedRunningTime="2025-10-01 11:20:16.313723571 +0000 UTC m=+3775.006544833" Oct 01 11:20:17 crc kubenswrapper[4735]: I1001 11:20:17.898151 4735 scope.go:117] "RemoveContainer" 
containerID="9aadc0db45cd48b358579f41dff851b6673b4158d94163b133ccc844259cfe53" Oct 01 11:20:17 crc kubenswrapper[4735]: E1001 11:20:17.898925 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:20:27 crc kubenswrapper[4735]: I1001 11:20:27.946920 4735 scope.go:117] "RemoveContainer" containerID="24b50f722b52bb6c43cffa275d301c452ab565b7d5b9376663723761988fa0b8" Oct 01 11:20:32 crc kubenswrapper[4735]: I1001 11:20:32.896994 4735 scope.go:117] "RemoveContainer" containerID="9aadc0db45cd48b358579f41dff851b6673b4158d94163b133ccc844259cfe53" Oct 01 11:20:32 crc kubenswrapper[4735]: E1001 11:20:32.897840 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:20:36 crc kubenswrapper[4735]: I1001 11:20:36.725405 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hpmph"] Oct 01 11:20:36 crc kubenswrapper[4735]: I1001 11:20:36.731614 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hpmph" Oct 01 11:20:36 crc kubenswrapper[4735]: I1001 11:20:36.747861 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hpmph"] Oct 01 11:20:36 crc kubenswrapper[4735]: I1001 11:20:36.826759 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac4e46b5-023c-40d7-bc52-ee2182087e78-catalog-content\") pod \"certified-operators-hpmph\" (UID: \"ac4e46b5-023c-40d7-bc52-ee2182087e78\") " pod="openshift-marketplace/certified-operators-hpmph" Oct 01 11:20:36 crc kubenswrapper[4735]: I1001 11:20:36.826825 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac4e46b5-023c-40d7-bc52-ee2182087e78-utilities\") pod \"certified-operators-hpmph\" (UID: \"ac4e46b5-023c-40d7-bc52-ee2182087e78\") " pod="openshift-marketplace/certified-operators-hpmph" Oct 01 11:20:36 crc kubenswrapper[4735]: I1001 11:20:36.827152 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg9n4\" (UniqueName: \"kubernetes.io/projected/ac4e46b5-023c-40d7-bc52-ee2182087e78-kube-api-access-xg9n4\") pod \"certified-operators-hpmph\" (UID: \"ac4e46b5-023c-40d7-bc52-ee2182087e78\") " pod="openshift-marketplace/certified-operators-hpmph" Oct 01 11:20:36 crc kubenswrapper[4735]: I1001 11:20:36.929378 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac4e46b5-023c-40d7-bc52-ee2182087e78-catalog-content\") pod \"certified-operators-hpmph\" (UID: \"ac4e46b5-023c-40d7-bc52-ee2182087e78\") " pod="openshift-marketplace/certified-operators-hpmph" Oct 01 11:20:36 crc kubenswrapper[4735]: I1001 11:20:36.929425 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac4e46b5-023c-40d7-bc52-ee2182087e78-utilities\") pod \"certified-operators-hpmph\" (UID: \"ac4e46b5-023c-40d7-bc52-ee2182087e78\") " pod="openshift-marketplace/certified-operators-hpmph" Oct 01 11:20:36 crc kubenswrapper[4735]: I1001 11:20:36.929492 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg9n4\" (UniqueName: \"kubernetes.io/projected/ac4e46b5-023c-40d7-bc52-ee2182087e78-kube-api-access-xg9n4\") pod \"certified-operators-hpmph\" (UID: \"ac4e46b5-023c-40d7-bc52-ee2182087e78\") " pod="openshift-marketplace/certified-operators-hpmph" Oct 01 11:20:36 crc kubenswrapper[4735]: I1001 11:20:36.929950 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac4e46b5-023c-40d7-bc52-ee2182087e78-utilities\") pod \"certified-operators-hpmph\" (UID: \"ac4e46b5-023c-40d7-bc52-ee2182087e78\") " pod="openshift-marketplace/certified-operators-hpmph" Oct 01 11:20:36 crc kubenswrapper[4735]: I1001 11:20:36.930064 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac4e46b5-023c-40d7-bc52-ee2182087e78-catalog-content\") pod \"certified-operators-hpmph\" (UID: \"ac4e46b5-023c-40d7-bc52-ee2182087e78\") " pod="openshift-marketplace/certified-operators-hpmph" Oct 01 11:20:36 crc kubenswrapper[4735]: I1001 11:20:36.955837 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg9n4\" (UniqueName: \"kubernetes.io/projected/ac4e46b5-023c-40d7-bc52-ee2182087e78-kube-api-access-xg9n4\") pod \"certified-operators-hpmph\" (UID: \"ac4e46b5-023c-40d7-bc52-ee2182087e78\") " pod="openshift-marketplace/certified-operators-hpmph" Oct 01 11:20:37 crc kubenswrapper[4735]: I1001 11:20:37.069871 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hpmph" Oct 01 11:20:37 crc kubenswrapper[4735]: I1001 11:20:37.589074 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hpmph"] Oct 01 11:20:38 crc kubenswrapper[4735]: I1001 11:20:38.488232 4735 generic.go:334] "Generic (PLEG): container finished" podID="ac4e46b5-023c-40d7-bc52-ee2182087e78" containerID="2f6f2f6b0ad7f9728b09b23a07a41520bb8b875635fede2b5c3aabcdf8a7ee49" exitCode=0 Oct 01 11:20:38 crc kubenswrapper[4735]: I1001 11:20:38.488705 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpmph" event={"ID":"ac4e46b5-023c-40d7-bc52-ee2182087e78","Type":"ContainerDied","Data":"2f6f2f6b0ad7f9728b09b23a07a41520bb8b875635fede2b5c3aabcdf8a7ee49"} Oct 01 11:20:38 crc kubenswrapper[4735]: I1001 11:20:38.488734 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpmph" event={"ID":"ac4e46b5-023c-40d7-bc52-ee2182087e78","Type":"ContainerStarted","Data":"cb8a3ac274690a6ef3400b13701a7c056e1e6a612844830a9933e8ef3039a900"} Oct 01 11:20:39 crc kubenswrapper[4735]: I1001 11:20:39.497349 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpmph" event={"ID":"ac4e46b5-023c-40d7-bc52-ee2182087e78","Type":"ContainerStarted","Data":"6103b881cddb8a0d7a747ea4f7b2ae75930009dd51c649b42cfa0104c623163b"} Oct 01 11:20:40 crc kubenswrapper[4735]: I1001 11:20:40.508054 4735 generic.go:334] "Generic (PLEG): container finished" podID="ac4e46b5-023c-40d7-bc52-ee2182087e78" containerID="6103b881cddb8a0d7a747ea4f7b2ae75930009dd51c649b42cfa0104c623163b" exitCode=0 Oct 01 11:20:40 crc kubenswrapper[4735]: I1001 11:20:40.508147 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpmph" 
event={"ID":"ac4e46b5-023c-40d7-bc52-ee2182087e78","Type":"ContainerDied","Data":"6103b881cddb8a0d7a747ea4f7b2ae75930009dd51c649b42cfa0104c623163b"} Oct 01 11:20:41 crc kubenswrapper[4735]: I1001 11:20:41.523828 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpmph" event={"ID":"ac4e46b5-023c-40d7-bc52-ee2182087e78","Type":"ContainerStarted","Data":"83e7f735b3707dc13473af2be4a67f5f09ebfcfa2af5a71441599b6653e8ec7f"} Oct 01 11:20:41 crc kubenswrapper[4735]: I1001 11:20:41.559470 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hpmph" podStartSLOduration=3.113177902 podStartE2EDuration="5.559448135s" podCreationTimestamp="2025-10-01 11:20:36 +0000 UTC" firstStartedPulling="2025-10-01 11:20:38.49041443 +0000 UTC m=+3797.183235692" lastFinishedPulling="2025-10-01 11:20:40.936684653 +0000 UTC m=+3799.629505925" observedRunningTime="2025-10-01 11:20:41.548355427 +0000 UTC m=+3800.241176699" watchObservedRunningTime="2025-10-01 11:20:41.559448135 +0000 UTC m=+3800.252269397" Oct 01 11:20:45 crc kubenswrapper[4735]: I1001 11:20:45.896992 4735 scope.go:117] "RemoveContainer" containerID="9aadc0db45cd48b358579f41dff851b6673b4158d94163b133ccc844259cfe53" Oct 01 11:20:45 crc kubenswrapper[4735]: E1001 11:20:45.897877 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:20:47 crc kubenswrapper[4735]: I1001 11:20:47.070187 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hpmph" Oct 01 11:20:47 crc 
kubenswrapper[4735]: I1001 11:20:47.071797 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hpmph" Oct 01 11:20:47 crc kubenswrapper[4735]: I1001 11:20:47.122519 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hpmph" Oct 01 11:20:47 crc kubenswrapper[4735]: I1001 11:20:47.645180 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hpmph" Oct 01 11:20:47 crc kubenswrapper[4735]: I1001 11:20:47.689437 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hpmph"] Oct 01 11:20:49 crc kubenswrapper[4735]: I1001 11:20:49.607829 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hpmph" podUID="ac4e46b5-023c-40d7-bc52-ee2182087e78" containerName="registry-server" containerID="cri-o://83e7f735b3707dc13473af2be4a67f5f09ebfcfa2af5a71441599b6653e8ec7f" gracePeriod=2 Oct 01 11:20:50 crc kubenswrapper[4735]: I1001 11:20:50.089665 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hpmph" Oct 01 11:20:50 crc kubenswrapper[4735]: I1001 11:20:50.256055 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac4e46b5-023c-40d7-bc52-ee2182087e78-catalog-content\") pod \"ac4e46b5-023c-40d7-bc52-ee2182087e78\" (UID: \"ac4e46b5-023c-40d7-bc52-ee2182087e78\") " Oct 01 11:20:50 crc kubenswrapper[4735]: I1001 11:20:50.256593 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac4e46b5-023c-40d7-bc52-ee2182087e78-utilities\") pod \"ac4e46b5-023c-40d7-bc52-ee2182087e78\" (UID: \"ac4e46b5-023c-40d7-bc52-ee2182087e78\") " Oct 01 11:20:50 crc kubenswrapper[4735]: I1001 11:20:50.257031 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac4e46b5-023c-40d7-bc52-ee2182087e78-utilities" (OuterVolumeSpecName: "utilities") pod "ac4e46b5-023c-40d7-bc52-ee2182087e78" (UID: "ac4e46b5-023c-40d7-bc52-ee2182087e78"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:20:50 crc kubenswrapper[4735]: I1001 11:20:50.257190 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg9n4\" (UniqueName: \"kubernetes.io/projected/ac4e46b5-023c-40d7-bc52-ee2182087e78-kube-api-access-xg9n4\") pod \"ac4e46b5-023c-40d7-bc52-ee2182087e78\" (UID: \"ac4e46b5-023c-40d7-bc52-ee2182087e78\") " Oct 01 11:20:50 crc kubenswrapper[4735]: I1001 11:20:50.258306 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac4e46b5-023c-40d7-bc52-ee2182087e78-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 11:20:50 crc kubenswrapper[4735]: I1001 11:20:50.265709 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac4e46b5-023c-40d7-bc52-ee2182087e78-kube-api-access-xg9n4" (OuterVolumeSpecName: "kube-api-access-xg9n4") pod "ac4e46b5-023c-40d7-bc52-ee2182087e78" (UID: "ac4e46b5-023c-40d7-bc52-ee2182087e78"). InnerVolumeSpecName "kube-api-access-xg9n4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:20:50 crc kubenswrapper[4735]: I1001 11:20:50.295444 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac4e46b5-023c-40d7-bc52-ee2182087e78-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac4e46b5-023c-40d7-bc52-ee2182087e78" (UID: "ac4e46b5-023c-40d7-bc52-ee2182087e78"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:20:50 crc kubenswrapper[4735]: I1001 11:20:50.360269 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg9n4\" (UniqueName: \"kubernetes.io/projected/ac4e46b5-023c-40d7-bc52-ee2182087e78-kube-api-access-xg9n4\") on node \"crc\" DevicePath \"\"" Oct 01 11:20:50 crc kubenswrapper[4735]: I1001 11:20:50.360303 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac4e46b5-023c-40d7-bc52-ee2182087e78-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 11:20:50 crc kubenswrapper[4735]: I1001 11:20:50.617824 4735 generic.go:334] "Generic (PLEG): container finished" podID="ac4e46b5-023c-40d7-bc52-ee2182087e78" containerID="83e7f735b3707dc13473af2be4a67f5f09ebfcfa2af5a71441599b6653e8ec7f" exitCode=0 Oct 01 11:20:50 crc kubenswrapper[4735]: I1001 11:20:50.617865 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpmph" event={"ID":"ac4e46b5-023c-40d7-bc52-ee2182087e78","Type":"ContainerDied","Data":"83e7f735b3707dc13473af2be4a67f5f09ebfcfa2af5a71441599b6653e8ec7f"} Oct 01 11:20:50 crc kubenswrapper[4735]: I1001 11:20:50.617896 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpmph" event={"ID":"ac4e46b5-023c-40d7-bc52-ee2182087e78","Type":"ContainerDied","Data":"cb8a3ac274690a6ef3400b13701a7c056e1e6a612844830a9933e8ef3039a900"} Oct 01 11:20:50 crc kubenswrapper[4735]: I1001 11:20:50.617905 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hpmph" Oct 01 11:20:50 crc kubenswrapper[4735]: I1001 11:20:50.617914 4735 scope.go:117] "RemoveContainer" containerID="83e7f735b3707dc13473af2be4a67f5f09ebfcfa2af5a71441599b6653e8ec7f" Oct 01 11:20:50 crc kubenswrapper[4735]: I1001 11:20:50.644068 4735 scope.go:117] "RemoveContainer" containerID="6103b881cddb8a0d7a747ea4f7b2ae75930009dd51c649b42cfa0104c623163b" Oct 01 11:20:50 crc kubenswrapper[4735]: I1001 11:20:50.659087 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hpmph"] Oct 01 11:20:50 crc kubenswrapper[4735]: I1001 11:20:50.667056 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hpmph"] Oct 01 11:20:50 crc kubenswrapper[4735]: I1001 11:20:50.675322 4735 scope.go:117] "RemoveContainer" containerID="2f6f2f6b0ad7f9728b09b23a07a41520bb8b875635fede2b5c3aabcdf8a7ee49" Oct 01 11:20:50 crc kubenswrapper[4735]: I1001 11:20:50.716726 4735 scope.go:117] "RemoveContainer" containerID="83e7f735b3707dc13473af2be4a67f5f09ebfcfa2af5a71441599b6653e8ec7f" Oct 01 11:20:50 crc kubenswrapper[4735]: E1001 11:20:50.717138 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83e7f735b3707dc13473af2be4a67f5f09ebfcfa2af5a71441599b6653e8ec7f\": container with ID starting with 83e7f735b3707dc13473af2be4a67f5f09ebfcfa2af5a71441599b6653e8ec7f not found: ID does not exist" containerID="83e7f735b3707dc13473af2be4a67f5f09ebfcfa2af5a71441599b6653e8ec7f" Oct 01 11:20:50 crc kubenswrapper[4735]: I1001 11:20:50.717181 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83e7f735b3707dc13473af2be4a67f5f09ebfcfa2af5a71441599b6653e8ec7f"} err="failed to get container status \"83e7f735b3707dc13473af2be4a67f5f09ebfcfa2af5a71441599b6653e8ec7f\": rpc error: code = NotFound desc = could not find 
container \"83e7f735b3707dc13473af2be4a67f5f09ebfcfa2af5a71441599b6653e8ec7f\": container with ID starting with 83e7f735b3707dc13473af2be4a67f5f09ebfcfa2af5a71441599b6653e8ec7f not found: ID does not exist" Oct 01 11:20:50 crc kubenswrapper[4735]: I1001 11:20:50.717207 4735 scope.go:117] "RemoveContainer" containerID="6103b881cddb8a0d7a747ea4f7b2ae75930009dd51c649b42cfa0104c623163b" Oct 01 11:20:50 crc kubenswrapper[4735]: E1001 11:20:50.717620 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6103b881cddb8a0d7a747ea4f7b2ae75930009dd51c649b42cfa0104c623163b\": container with ID starting with 6103b881cddb8a0d7a747ea4f7b2ae75930009dd51c649b42cfa0104c623163b not found: ID does not exist" containerID="6103b881cddb8a0d7a747ea4f7b2ae75930009dd51c649b42cfa0104c623163b" Oct 01 11:20:50 crc kubenswrapper[4735]: I1001 11:20:50.717650 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6103b881cddb8a0d7a747ea4f7b2ae75930009dd51c649b42cfa0104c623163b"} err="failed to get container status \"6103b881cddb8a0d7a747ea4f7b2ae75930009dd51c649b42cfa0104c623163b\": rpc error: code = NotFound desc = could not find container \"6103b881cddb8a0d7a747ea4f7b2ae75930009dd51c649b42cfa0104c623163b\": container with ID starting with 6103b881cddb8a0d7a747ea4f7b2ae75930009dd51c649b42cfa0104c623163b not found: ID does not exist" Oct 01 11:20:50 crc kubenswrapper[4735]: I1001 11:20:50.717674 4735 scope.go:117] "RemoveContainer" containerID="2f6f2f6b0ad7f9728b09b23a07a41520bb8b875635fede2b5c3aabcdf8a7ee49" Oct 01 11:20:50 crc kubenswrapper[4735]: E1001 11:20:50.717851 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f6f2f6b0ad7f9728b09b23a07a41520bb8b875635fede2b5c3aabcdf8a7ee49\": container with ID starting with 2f6f2f6b0ad7f9728b09b23a07a41520bb8b875635fede2b5c3aabcdf8a7ee49 not found: ID does 
not exist" containerID="2f6f2f6b0ad7f9728b09b23a07a41520bb8b875635fede2b5c3aabcdf8a7ee49" Oct 01 11:20:50 crc kubenswrapper[4735]: I1001 11:20:50.717874 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f6f2f6b0ad7f9728b09b23a07a41520bb8b875635fede2b5c3aabcdf8a7ee49"} err="failed to get container status \"2f6f2f6b0ad7f9728b09b23a07a41520bb8b875635fede2b5c3aabcdf8a7ee49\": rpc error: code = NotFound desc = could not find container \"2f6f2f6b0ad7f9728b09b23a07a41520bb8b875635fede2b5c3aabcdf8a7ee49\": container with ID starting with 2f6f2f6b0ad7f9728b09b23a07a41520bb8b875635fede2b5c3aabcdf8a7ee49 not found: ID does not exist" Oct 01 11:20:51 crc kubenswrapper[4735]: I1001 11:20:51.912295 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac4e46b5-023c-40d7-bc52-ee2182087e78" path="/var/lib/kubelet/pods/ac4e46b5-023c-40d7-bc52-ee2182087e78/volumes" Oct 01 11:20:59 crc kubenswrapper[4735]: I1001 11:20:59.898334 4735 scope.go:117] "RemoveContainer" containerID="9aadc0db45cd48b358579f41dff851b6673b4158d94163b133ccc844259cfe53" Oct 01 11:20:59 crc kubenswrapper[4735]: E1001 11:20:59.899108 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:21:11 crc kubenswrapper[4735]: I1001 11:21:11.902416 4735 scope.go:117] "RemoveContainer" containerID="9aadc0db45cd48b358579f41dff851b6673b4158d94163b133ccc844259cfe53" Oct 01 11:21:11 crc kubenswrapper[4735]: E1001 11:21:11.903779 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:21:13 crc kubenswrapper[4735]: I1001 11:21:13.675640 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7ddbc654c4-6bddt_e4554523-3ac0-4f1d-8a5d-f8892d72229d/barbican-api-log/0.log" Oct 01 11:21:13 crc kubenswrapper[4735]: I1001 11:21:13.721290 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7ddbc654c4-6bddt_e4554523-3ac0-4f1d-8a5d-f8892d72229d/barbican-api/0.log" Oct 01 11:21:13 crc kubenswrapper[4735]: I1001 11:21:13.868759 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-599c96b4d8-46mbs_4ce28744-bfa2-4674-a495-02abf8245d38/barbican-keystone-listener/0.log" Oct 01 11:21:13 crc kubenswrapper[4735]: I1001 11:21:13.942135 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-599c96b4d8-46mbs_4ce28744-bfa2-4674-a495-02abf8245d38/barbican-keystone-listener-log/0.log" Oct 01 11:21:14 crc kubenswrapper[4735]: I1001 11:21:14.041206 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-55fd7c674f-kfd9n_39f45bdc-e25e-41f8-aefe-0dede18c4bb8/barbican-worker/0.log" Oct 01 11:21:14 crc kubenswrapper[4735]: I1001 11:21:14.136459 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-55fd7c674f-kfd9n_39f45bdc-e25e-41f8-aefe-0dede18c4bb8/barbican-worker-log/0.log" Oct 01 11:21:14 crc kubenswrapper[4735]: I1001 11:21:14.253183 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-b28nq_ffacc50e-734b-4e8a-ac0c-a33197ce2351/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 
11:21:14 crc kubenswrapper[4735]: I1001 11:21:14.428852 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_181bb1ae-a923-4bbd-8d8f-e5d8c8878214/ceilometer-central-agent/0.log" Oct 01 11:21:14 crc kubenswrapper[4735]: I1001 11:21:14.455601 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_181bb1ae-a923-4bbd-8d8f-e5d8c8878214/ceilometer-notification-agent/0.log" Oct 01 11:21:14 crc kubenswrapper[4735]: I1001 11:21:14.534271 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_181bb1ae-a923-4bbd-8d8f-e5d8c8878214/proxy-httpd/0.log" Oct 01 11:21:14 crc kubenswrapper[4735]: I1001 11:21:14.579621 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_181bb1ae-a923-4bbd-8d8f-e5d8c8878214/sg-core/0.log" Oct 01 11:21:14 crc kubenswrapper[4735]: I1001 11:21:14.730413 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a00c66d0-b381-43e2-ae45-8635ff4f424e/cinder-api/0.log" Oct 01 11:21:14 crc kubenswrapper[4735]: I1001 11:21:14.791657 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a00c66d0-b381-43e2-ae45-8635ff4f424e/cinder-api-log/0.log" Oct 01 11:21:14 crc kubenswrapper[4735]: I1001 11:21:14.950370 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_c7925532-6ff1-4914-84b0-8206d0ad5225/cinder-scheduler/0.log" Oct 01 11:21:15 crc kubenswrapper[4735]: I1001 11:21:15.007958 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_c7925532-6ff1-4914-84b0-8206d0ad5225/probe/0.log" Oct 01 11:21:15 crc kubenswrapper[4735]: I1001 11:21:15.162031 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-4hx9b_697ab5a4-56ef-4755-9211-fcd52866c939/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 11:21:15 crc 
kubenswrapper[4735]: I1001 11:21:15.275090 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-qlbrt_36eb08d7-ab2a-4a0b-8a50-2511b19b3e3c/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 11:21:15 crc kubenswrapper[4735]: I1001 11:21:15.401251 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-svdrq_5b381c11-584e-4a17-b4a4-cd150f2d3d82/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 11:21:15 crc kubenswrapper[4735]: I1001 11:21:15.573539 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-4vfgs_9b994d24-224b-42cf-8516-044c561a5f4e/init/0.log" Oct 01 11:21:15 crc kubenswrapper[4735]: I1001 11:21:15.729301 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-4vfgs_9b994d24-224b-42cf-8516-044c561a5f4e/init/0.log" Oct 01 11:21:15 crc kubenswrapper[4735]: I1001 11:21:15.760402 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-4vfgs_9b994d24-224b-42cf-8516-044c561a5f4e/dnsmasq-dns/0.log" Oct 01 11:21:15 crc kubenswrapper[4735]: I1001 11:21:15.934796 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-pq2m5_31970fe6-5cef-41cc-8799-c4e9de559f23/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 11:21:16 crc kubenswrapper[4735]: I1001 11:21:16.065184 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_53b4ae38-a993-4a83-93bf-da796d4be856/glance-httpd/0.log" Oct 01 11:21:16 crc kubenswrapper[4735]: I1001 11:21:16.216578 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_53b4ae38-a993-4a83-93bf-da796d4be856/glance-log/0.log" Oct 01 11:21:16 crc kubenswrapper[4735]: I1001 11:21:16.301951 
4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_971696de-af05-4d40-89d1-64a2688b08e0/glance-httpd/0.log" Oct 01 11:21:16 crc kubenswrapper[4735]: I1001 11:21:16.408404 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_971696de-af05-4d40-89d1-64a2688b08e0/glance-log/0.log" Oct 01 11:21:16 crc kubenswrapper[4735]: I1001 11:21:16.552282 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7746dbdbf6-t6f7n_7353c4ca-59bc-4a50-8840-8365f90f6384/horizon/0.log" Oct 01 11:21:16 crc kubenswrapper[4735]: I1001 11:21:16.712475 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-d6n8z_77d6c2ca-cb77-4582-b644-d077086a29b5/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 11:21:16 crc kubenswrapper[4735]: I1001 11:21:16.932685 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-cd2rs_569a20cc-087a-4c93-b23b-af5c6b209b80/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 11:21:16 crc kubenswrapper[4735]: I1001 11:21:16.962381 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7746dbdbf6-t6f7n_7353c4ca-59bc-4a50-8840-8365f90f6384/horizon-log/0.log" Oct 01 11:21:17 crc kubenswrapper[4735]: I1001 11:21:17.118910 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29321941-p4wv8_99c1d24d-af2e-4093-bd99-d1c1cdabd8be/keystone-cron/0.log" Oct 01 11:21:17 crc kubenswrapper[4735]: I1001 11:21:17.138978 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7888d7549b-dbpkr_98103d81-4a3c-4c99-9d51-f73f8e5fd295/keystone-api/0.log" Oct 01 11:21:17 crc kubenswrapper[4735]: I1001 11:21:17.286113 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_kube-state-metrics-0_f9f14b82-f708-409b-94cb-34b6863dc8cc/kube-state-metrics/0.log" Oct 01 11:21:17 crc kubenswrapper[4735]: I1001 11:21:17.338058 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-tvzvs_98e6a81a-a5bd-49eb-9d2b-a24ba5e336b7/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 11:21:17 crc kubenswrapper[4735]: I1001 11:21:17.702145 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-798b4f9b87-frx5r_d235163d-548f-40e3-9aae-490a41523da2/neutron-httpd/0.log" Oct 01 11:21:17 crc kubenswrapper[4735]: I1001 11:21:17.726983 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-798b4f9b87-frx5r_d235163d-548f-40e3-9aae-490a41523da2/neutron-api/0.log" Oct 01 11:21:17 crc kubenswrapper[4735]: I1001 11:21:17.923948 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-tk9hw_9d75fc81-9126-4b8e-b623-47a8c65adb8f/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 11:21:18 crc kubenswrapper[4735]: I1001 11:21:18.398873 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_62b96f55-0807-4922-acaf-a84037e549ff/nova-api-log/0.log" Oct 01 11:21:18 crc kubenswrapper[4735]: I1001 11:21:18.619573 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_62b96f55-0807-4922-acaf-a84037e549ff/nova-api-api/0.log" Oct 01 11:21:18 crc kubenswrapper[4735]: I1001 11:21:18.657139 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_61a477e3-4a22-4c96-bfdb-c72c65d4984c/nova-cell0-conductor-conductor/0.log" Oct 01 11:21:18 crc kubenswrapper[4735]: I1001 11:21:18.983268 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_bf5a2979-222f-459d-9c57-599ebc27167e/nova-cell1-conductor-conductor/0.log" 
Oct 01 11:21:19 crc kubenswrapper[4735]: I1001 11:21:19.049626 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_8639e0ae-f968-4b8f-b73d-52c2aba0ad24/nova-cell1-novncproxy-novncproxy/0.log" Oct 01 11:21:19 crc kubenswrapper[4735]: I1001 11:21:19.195437 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-m7tr8_83863343-c31f-484c-9e44-3e6ed41988d8/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 11:21:19 crc kubenswrapper[4735]: I1001 11:21:19.354598 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_79b8eab7-e3a4-4194-852d-1f1b91155a7d/nova-metadata-log/0.log" Oct 01 11:21:19 crc kubenswrapper[4735]: I1001 11:21:19.806977 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_6ce10bc0-35a7-49e7-b138-196478a093d0/nova-scheduler-scheduler/0.log" Oct 01 11:21:19 crc kubenswrapper[4735]: I1001 11:21:19.823531 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ec099172-9672-4553-94cd-c430818da51d/mysql-bootstrap/0.log" Oct 01 11:21:20 crc kubenswrapper[4735]: I1001 11:21:20.022249 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ec099172-9672-4553-94cd-c430818da51d/galera/0.log" Oct 01 11:21:20 crc kubenswrapper[4735]: I1001 11:21:20.029587 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ec099172-9672-4553-94cd-c430818da51d/mysql-bootstrap/0.log" Oct 01 11:21:20 crc kubenswrapper[4735]: I1001 11:21:20.236920 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_15f57822-7418-47e6-b679-aea87612b3ec/mysql-bootstrap/0.log" Oct 01 11:21:20 crc kubenswrapper[4735]: I1001 11:21:20.445733 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_15f57822-7418-47e6-b679-aea87612b3ec/mysql-bootstrap/0.log" Oct 01 11:21:20 crc kubenswrapper[4735]: I1001 11:21:20.479350 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_15f57822-7418-47e6-b679-aea87612b3ec/galera/0.log" Oct 01 11:21:20 crc kubenswrapper[4735]: I1001 11:21:20.671329 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_c4b138e5-c197-4f7d-b0c3-aed5f8dd07e5/openstackclient/0.log" Oct 01 11:21:20 crc kubenswrapper[4735]: I1001 11:21:20.804977 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_79b8eab7-e3a4-4194-852d-1f1b91155a7d/nova-metadata-metadata/0.log" Oct 01 11:21:20 crc kubenswrapper[4735]: I1001 11:21:20.870136 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-4vkk2_83a05602-9f43-41cf-af06-9e6d2109e6c9/openstack-network-exporter/0.log" Oct 01 11:21:21 crc kubenswrapper[4735]: I1001 11:21:21.040219 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4k7wp_43ce7edf-2010-4ccf-ac60-b26606130624/ovsdb-server-init/0.log" Oct 01 11:21:21 crc kubenswrapper[4735]: I1001 11:21:21.219204 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4k7wp_43ce7edf-2010-4ccf-ac60-b26606130624/ovsdb-server/0.log" Oct 01 11:21:21 crc kubenswrapper[4735]: I1001 11:21:21.225617 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4k7wp_43ce7edf-2010-4ccf-ac60-b26606130624/ovs-vswitchd/0.log" Oct 01 11:21:21 crc kubenswrapper[4735]: I1001 11:21:21.227965 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4k7wp_43ce7edf-2010-4ccf-ac60-b26606130624/ovsdb-server-init/0.log" Oct 01 11:21:21 crc kubenswrapper[4735]: I1001 11:21:21.442443 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-zmn7b_3bcc7869-f6b2-4c99-adde-40577b12c99d/ovn-controller/0.log" Oct 01 11:21:21 crc kubenswrapper[4735]: I1001 11:21:21.675456 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_356e4644-7f54-4b34-b72a-510958be19e5/openstack-network-exporter/0.log" Oct 01 11:21:21 crc kubenswrapper[4735]: I1001 11:21:21.688585 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-srqk9_d5532968-8896-44ba-a120-62bacb3bf10a/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 11:21:21 crc kubenswrapper[4735]: I1001 11:21:21.839985 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_356e4644-7f54-4b34-b72a-510958be19e5/ovn-northd/0.log" Oct 01 11:21:21 crc kubenswrapper[4735]: I1001 11:21:21.937173 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_33a18104-7c91-4314-99e8-37396ef7c259/openstack-network-exporter/0.log" Oct 01 11:21:22 crc kubenswrapper[4735]: I1001 11:21:22.056618 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_33a18104-7c91-4314-99e8-37396ef7c259/ovsdbserver-nb/0.log" Oct 01 11:21:22 crc kubenswrapper[4735]: I1001 11:21:22.141634 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_aa211c9f-0ea4-4b46-9f37-c5917dd0d833/openstack-network-exporter/0.log" Oct 01 11:21:22 crc kubenswrapper[4735]: I1001 11:21:22.244391 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_aa211c9f-0ea4-4b46-9f37-c5917dd0d833/ovsdbserver-sb/0.log" Oct 01 11:21:22 crc kubenswrapper[4735]: I1001 11:21:22.399597 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-85c495dfd8-k8nc5_5542515c-c850-46f5-875b-65c55c28cbdc/placement-api/0.log" Oct 01 11:21:22 crc kubenswrapper[4735]: I1001 11:21:22.539381 4735 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_placement-85c495dfd8-k8nc5_5542515c-c850-46f5-875b-65c55c28cbdc/placement-log/0.log" Oct 01 11:21:22 crc kubenswrapper[4735]: I1001 11:21:22.612245 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_cc51941b-ed03-480a-a90b-ba40dec75a6c/setup-container/0.log" Oct 01 11:21:22 crc kubenswrapper[4735]: I1001 11:21:22.906406 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_cc51941b-ed03-480a-a90b-ba40dec75a6c/setup-container/0.log" Oct 01 11:21:22 crc kubenswrapper[4735]: I1001 11:21:22.953921 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_cc51941b-ed03-480a-a90b-ba40dec75a6c/rabbitmq/0.log" Oct 01 11:21:23 crc kubenswrapper[4735]: I1001 11:21:23.095981 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_48a5abfa-1c13-4130-8cad-1596c95ef581/setup-container/0.log" Oct 01 11:21:23 crc kubenswrapper[4735]: I1001 11:21:23.332703 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_48a5abfa-1c13-4130-8cad-1596c95ef581/rabbitmq/0.log" Oct 01 11:21:23 crc kubenswrapper[4735]: I1001 11:21:23.360463 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_48a5abfa-1c13-4130-8cad-1596c95ef581/setup-container/0.log" Oct 01 11:21:23 crc kubenswrapper[4735]: I1001 11:21:23.508919 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-chlk8_fa85eebe-cbdf-41f6-b47b-5e844222f3fe/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 11:21:23 crc kubenswrapper[4735]: I1001 11:21:23.624311 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-ldfrh_f3b0231f-c3ff-46dd-869c-c36d62466f45/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 11:21:23 crc 
kubenswrapper[4735]: I1001 11:21:23.803981 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-s7drs_7847fa7b-0680-48a0-bbba-2adf6b14fcec/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 11:21:23 crc kubenswrapper[4735]: I1001 11:21:23.972955 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-c8mbg_7223ea14-1a97-4c77-bbab-5f2919606539/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 11:21:24 crc kubenswrapper[4735]: I1001 11:21:24.168448 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-dpr46_57690d02-83f0-47e6-a66f-da0ab4138820/ssh-known-hosts-edpm-deployment/0.log" Oct 01 11:21:24 crc kubenswrapper[4735]: I1001 11:21:24.370367 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-696cd688cf-kqrbf_95643272-0db0-4c04-9087-98321b57c893/proxy-server/0.log" Oct 01 11:21:24 crc kubenswrapper[4735]: I1001 11:21:24.381925 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-696cd688cf-kqrbf_95643272-0db0-4c04-9087-98321b57c893/proxy-httpd/0.log" Oct 01 11:21:24 crc kubenswrapper[4735]: I1001 11:21:24.562142 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-8gpx2_b5dd8db9-b427-4510-b6ab-82883a128fa2/swift-ring-rebalance/0.log" Oct 01 11:21:24 crc kubenswrapper[4735]: I1001 11:21:24.668281 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_651abe6c-1b2e-4652-a985-74f6cf2c7e17/account-auditor/0.log" Oct 01 11:21:24 crc kubenswrapper[4735]: I1001 11:21:24.740567 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_651abe6c-1b2e-4652-a985-74f6cf2c7e17/account-reaper/0.log" Oct 01 11:21:24 crc kubenswrapper[4735]: I1001 11:21:24.860099 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_651abe6c-1b2e-4652-a985-74f6cf2c7e17/account-replicator/0.log" Oct 01 11:21:24 crc kubenswrapper[4735]: I1001 11:21:24.861366 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_651abe6c-1b2e-4652-a985-74f6cf2c7e17/account-server/0.log" Oct 01 11:21:24 crc kubenswrapper[4735]: I1001 11:21:24.945219 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_651abe6c-1b2e-4652-a985-74f6cf2c7e17/container-auditor/0.log" Oct 01 11:21:25 crc kubenswrapper[4735]: I1001 11:21:25.087785 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_651abe6c-1b2e-4652-a985-74f6cf2c7e17/container-server/0.log" Oct 01 11:21:25 crc kubenswrapper[4735]: I1001 11:21:25.089864 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_651abe6c-1b2e-4652-a985-74f6cf2c7e17/container-replicator/0.log" Oct 01 11:21:25 crc kubenswrapper[4735]: I1001 11:21:25.174922 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_651abe6c-1b2e-4652-a985-74f6cf2c7e17/container-updater/0.log" Oct 01 11:21:25 crc kubenswrapper[4735]: I1001 11:21:25.302006 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_651abe6c-1b2e-4652-a985-74f6cf2c7e17/object-expirer/0.log" Oct 01 11:21:25 crc kubenswrapper[4735]: I1001 11:21:25.309049 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_651abe6c-1b2e-4652-a985-74f6cf2c7e17/object-auditor/0.log" Oct 01 11:21:25 crc kubenswrapper[4735]: I1001 11:21:25.367765 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_651abe6c-1b2e-4652-a985-74f6cf2c7e17/object-replicator/0.log" Oct 01 11:21:25 crc kubenswrapper[4735]: I1001 11:21:25.486053 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_651abe6c-1b2e-4652-a985-74f6cf2c7e17/object-updater/0.log" Oct 01 11:21:25 crc kubenswrapper[4735]: I1001 11:21:25.496290 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_651abe6c-1b2e-4652-a985-74f6cf2c7e17/object-server/0.log" Oct 01 11:21:25 crc kubenswrapper[4735]: I1001 11:21:25.539227 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_651abe6c-1b2e-4652-a985-74f6cf2c7e17/rsync/0.log" Oct 01 11:21:25 crc kubenswrapper[4735]: I1001 11:21:25.656076 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_651abe6c-1b2e-4652-a985-74f6cf2c7e17/swift-recon-cron/0.log" Oct 01 11:21:25 crc kubenswrapper[4735]: I1001 11:21:25.813912 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-6nbvq_061a2955-62b7-47d5-b62c-abb147006933/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 11:21:25 crc kubenswrapper[4735]: I1001 11:21:25.971351 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_66ee97cc-ac56-4879-b605-e2a9347213ca/tempest-tests-tempest-tests-runner/0.log" Oct 01 11:21:26 crc kubenswrapper[4735]: I1001 11:21:26.129717 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_6001430a-4111-47f2-ba18-ee2e2661bb7c/test-operator-logs-container/0.log" Oct 01 11:21:26 crc kubenswrapper[4735]: I1001 11:21:26.256266 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-t8gc9_02bd9618-e194-4d1b-98f5-90ab53e53e39/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 11:21:26 crc kubenswrapper[4735]: I1001 11:21:26.897664 4735 scope.go:117] "RemoveContainer" containerID="9aadc0db45cd48b358579f41dff851b6673b4158d94163b133ccc844259cfe53" Oct 01 
11:21:26 crc kubenswrapper[4735]: E1001 11:21:26.898066 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:21:36 crc kubenswrapper[4735]: I1001 11:21:36.150684 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_35686057-f8f4-4ef2-8a22-b9de9c15c9e5/memcached/0.log" Oct 01 11:21:39 crc kubenswrapper[4735]: I1001 11:21:39.897523 4735 scope.go:117] "RemoveContainer" containerID="9aadc0db45cd48b358579f41dff851b6673b4158d94163b133ccc844259cfe53" Oct 01 11:21:39 crc kubenswrapper[4735]: E1001 11:21:39.898292 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:21:51 crc kubenswrapper[4735]: I1001 11:21:51.919698 4735 scope.go:117] "RemoveContainer" containerID="9aadc0db45cd48b358579f41dff851b6673b4158d94163b133ccc844259cfe53" Oct 01 11:21:51 crc kubenswrapper[4735]: E1001 11:21:51.920929 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" 
podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:21:56 crc kubenswrapper[4735]: I1001 11:21:56.222953 4735 generic.go:334] "Generic (PLEG): container finished" podID="b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb" containerID="3e82b6be40849266afaafdf8afbe6435c888262bfe2300395e7e882abe03cf52" exitCode=0 Oct 01 11:21:56 crc kubenswrapper[4735]: I1001 11:21:56.223071 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hr556/crc-debug-xgjrq" event={"ID":"b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb","Type":"ContainerDied","Data":"3e82b6be40849266afaafdf8afbe6435c888262bfe2300395e7e882abe03cf52"} Oct 01 11:21:57 crc kubenswrapper[4735]: I1001 11:21:57.371720 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hr556/crc-debug-xgjrq" Oct 01 11:21:57 crc kubenswrapper[4735]: I1001 11:21:57.424345 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hr556/crc-debug-xgjrq"] Oct 01 11:21:57 crc kubenswrapper[4735]: I1001 11:21:57.435985 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hr556/crc-debug-xgjrq"] Oct 01 11:21:57 crc kubenswrapper[4735]: I1001 11:21:57.522124 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggclx\" (UniqueName: \"kubernetes.io/projected/b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb-kube-api-access-ggclx\") pod \"b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb\" (UID: \"b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb\") " Oct 01 11:21:57 crc kubenswrapper[4735]: I1001 11:21:57.522392 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb-host\") pod \"b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb\" (UID: \"b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb\") " Oct 01 11:21:57 crc kubenswrapper[4735]: I1001 11:21:57.522703 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb-host" (OuterVolumeSpecName: "host") pod "b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb" (UID: "b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 11:21:57 crc kubenswrapper[4735]: I1001 11:21:57.523055 4735 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb-host\") on node \"crc\" DevicePath \"\"" Oct 01 11:21:57 crc kubenswrapper[4735]: I1001 11:21:57.530145 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb-kube-api-access-ggclx" (OuterVolumeSpecName: "kube-api-access-ggclx") pod "b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb" (UID: "b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb"). InnerVolumeSpecName "kube-api-access-ggclx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:21:57 crc kubenswrapper[4735]: I1001 11:21:57.625089 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggclx\" (UniqueName: \"kubernetes.io/projected/b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb-kube-api-access-ggclx\") on node \"crc\" DevicePath \"\"" Oct 01 11:21:57 crc kubenswrapper[4735]: I1001 11:21:57.910401 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb" path="/var/lib/kubelet/pods/b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb/volumes" Oct 01 11:21:58 crc kubenswrapper[4735]: I1001 11:21:58.247674 4735 scope.go:117] "RemoveContainer" containerID="3e82b6be40849266afaafdf8afbe6435c888262bfe2300395e7e882abe03cf52" Oct 01 11:21:58 crc kubenswrapper[4735]: I1001 11:21:58.247797 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hr556/crc-debug-xgjrq" Oct 01 11:21:58 crc kubenswrapper[4735]: I1001 11:21:58.628911 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hr556/crc-debug-hffp8"] Oct 01 11:21:58 crc kubenswrapper[4735]: E1001 11:21:58.629273 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac4e46b5-023c-40d7-bc52-ee2182087e78" containerName="extract-content" Oct 01 11:21:58 crc kubenswrapper[4735]: I1001 11:21:58.629286 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac4e46b5-023c-40d7-bc52-ee2182087e78" containerName="extract-content" Oct 01 11:21:58 crc kubenswrapper[4735]: E1001 11:21:58.629306 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac4e46b5-023c-40d7-bc52-ee2182087e78" containerName="registry-server" Oct 01 11:21:58 crc kubenswrapper[4735]: I1001 11:21:58.629314 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac4e46b5-023c-40d7-bc52-ee2182087e78" containerName="registry-server" Oct 01 11:21:58 crc kubenswrapper[4735]: E1001 11:21:58.629330 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac4e46b5-023c-40d7-bc52-ee2182087e78" containerName="extract-utilities" Oct 01 11:21:58 crc kubenswrapper[4735]: I1001 11:21:58.629336 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac4e46b5-023c-40d7-bc52-ee2182087e78" containerName="extract-utilities" Oct 01 11:21:58 crc kubenswrapper[4735]: E1001 11:21:58.629352 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb" containerName="container-00" Oct 01 11:21:58 crc kubenswrapper[4735]: I1001 11:21:58.629358 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb" containerName="container-00" Oct 01 11:21:58 crc kubenswrapper[4735]: I1001 11:21:58.629556 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac4e46b5-023c-40d7-bc52-ee2182087e78" 
containerName="registry-server" Oct 01 11:21:58 crc kubenswrapper[4735]: I1001 11:21:58.629569 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b83b63a5-b37b-4b00-bb1f-3f7fbebfb5bb" containerName="container-00" Oct 01 11:21:58 crc kubenswrapper[4735]: I1001 11:21:58.630160 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hr556/crc-debug-hffp8" Oct 01 11:21:58 crc kubenswrapper[4735]: I1001 11:21:58.747284 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c452854b-71a8-4aad-baee-9c4e59b2d082-host\") pod \"crc-debug-hffp8\" (UID: \"c452854b-71a8-4aad-baee-9c4e59b2d082\") " pod="openshift-must-gather-hr556/crc-debug-hffp8" Oct 01 11:21:58 crc kubenswrapper[4735]: I1001 11:21:58.747638 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlsn5\" (UniqueName: \"kubernetes.io/projected/c452854b-71a8-4aad-baee-9c4e59b2d082-kube-api-access-xlsn5\") pod \"crc-debug-hffp8\" (UID: \"c452854b-71a8-4aad-baee-9c4e59b2d082\") " pod="openshift-must-gather-hr556/crc-debug-hffp8" Oct 01 11:21:58 crc kubenswrapper[4735]: I1001 11:21:58.849123 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c452854b-71a8-4aad-baee-9c4e59b2d082-host\") pod \"crc-debug-hffp8\" (UID: \"c452854b-71a8-4aad-baee-9c4e59b2d082\") " pod="openshift-must-gather-hr556/crc-debug-hffp8" Oct 01 11:21:58 crc kubenswrapper[4735]: I1001 11:21:58.849211 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlsn5\" (UniqueName: \"kubernetes.io/projected/c452854b-71a8-4aad-baee-9c4e59b2d082-kube-api-access-xlsn5\") pod \"crc-debug-hffp8\" (UID: \"c452854b-71a8-4aad-baee-9c4e59b2d082\") " pod="openshift-must-gather-hr556/crc-debug-hffp8" Oct 01 11:21:58 crc 
kubenswrapper[4735]: I1001 11:21:58.849327 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c452854b-71a8-4aad-baee-9c4e59b2d082-host\") pod \"crc-debug-hffp8\" (UID: \"c452854b-71a8-4aad-baee-9c4e59b2d082\") " pod="openshift-must-gather-hr556/crc-debug-hffp8" Oct 01 11:21:58 crc kubenswrapper[4735]: I1001 11:21:58.879735 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlsn5\" (UniqueName: \"kubernetes.io/projected/c452854b-71a8-4aad-baee-9c4e59b2d082-kube-api-access-xlsn5\") pod \"crc-debug-hffp8\" (UID: \"c452854b-71a8-4aad-baee-9c4e59b2d082\") " pod="openshift-must-gather-hr556/crc-debug-hffp8" Oct 01 11:21:58 crc kubenswrapper[4735]: I1001 11:21:58.948074 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hr556/crc-debug-hffp8" Oct 01 11:21:59 crc kubenswrapper[4735]: I1001 11:21:59.260463 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hr556/crc-debug-hffp8" event={"ID":"c452854b-71a8-4aad-baee-9c4e59b2d082","Type":"ContainerStarted","Data":"90e1c6c7fc968dc25ae747f94f34998153faf92b61dac578f0ac45e36c66d85a"} Oct 01 11:22:00 crc kubenswrapper[4735]: I1001 11:22:00.277428 4735 generic.go:334] "Generic (PLEG): container finished" podID="c452854b-71a8-4aad-baee-9c4e59b2d082" containerID="de3bf936d41e813d70fd8f2fd90d13f998440630c65d310a1a1ff810672bf188" exitCode=0 Oct 01 11:22:00 crc kubenswrapper[4735]: I1001 11:22:00.277548 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hr556/crc-debug-hffp8" event={"ID":"c452854b-71a8-4aad-baee-9c4e59b2d082","Type":"ContainerDied","Data":"de3bf936d41e813d70fd8f2fd90d13f998440630c65d310a1a1ff810672bf188"} Oct 01 11:22:01 crc kubenswrapper[4735]: I1001 11:22:01.184001 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b9fp2"] Oct 01 11:22:01 crc 
kubenswrapper[4735]: I1001 11:22:01.186298 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b9fp2" Oct 01 11:22:01 crc kubenswrapper[4735]: I1001 11:22:01.194033 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b9fp2"] Oct 01 11:22:01 crc kubenswrapper[4735]: I1001 11:22:01.297080 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08457cf6-869f-4289-add8-7ee5683b189c-utilities\") pod \"community-operators-b9fp2\" (UID: \"08457cf6-869f-4289-add8-7ee5683b189c\") " pod="openshift-marketplace/community-operators-b9fp2" Oct 01 11:22:01 crc kubenswrapper[4735]: I1001 11:22:01.297171 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08457cf6-869f-4289-add8-7ee5683b189c-catalog-content\") pod \"community-operators-b9fp2\" (UID: \"08457cf6-869f-4289-add8-7ee5683b189c\") " pod="openshift-marketplace/community-operators-b9fp2" Oct 01 11:22:01 crc kubenswrapper[4735]: I1001 11:22:01.297291 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hhqd\" (UniqueName: \"kubernetes.io/projected/08457cf6-869f-4289-add8-7ee5683b189c-kube-api-access-4hhqd\") pod \"community-operators-b9fp2\" (UID: \"08457cf6-869f-4289-add8-7ee5683b189c\") " pod="openshift-marketplace/community-operators-b9fp2" Oct 01 11:22:01 crc kubenswrapper[4735]: I1001 11:22:01.398281 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hhqd\" (UniqueName: \"kubernetes.io/projected/08457cf6-869f-4289-add8-7ee5683b189c-kube-api-access-4hhqd\") pod \"community-operators-b9fp2\" (UID: \"08457cf6-869f-4289-add8-7ee5683b189c\") " 
pod="openshift-marketplace/community-operators-b9fp2" Oct 01 11:22:01 crc kubenswrapper[4735]: I1001 11:22:01.398346 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08457cf6-869f-4289-add8-7ee5683b189c-utilities\") pod \"community-operators-b9fp2\" (UID: \"08457cf6-869f-4289-add8-7ee5683b189c\") " pod="openshift-marketplace/community-operators-b9fp2" Oct 01 11:22:01 crc kubenswrapper[4735]: I1001 11:22:01.398392 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08457cf6-869f-4289-add8-7ee5683b189c-catalog-content\") pod \"community-operators-b9fp2\" (UID: \"08457cf6-869f-4289-add8-7ee5683b189c\") " pod="openshift-marketplace/community-operators-b9fp2" Oct 01 11:22:01 crc kubenswrapper[4735]: I1001 11:22:01.398811 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08457cf6-869f-4289-add8-7ee5683b189c-catalog-content\") pod \"community-operators-b9fp2\" (UID: \"08457cf6-869f-4289-add8-7ee5683b189c\") " pod="openshift-marketplace/community-operators-b9fp2" Oct 01 11:22:01 crc kubenswrapper[4735]: I1001 11:22:01.399360 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08457cf6-869f-4289-add8-7ee5683b189c-utilities\") pod \"community-operators-b9fp2\" (UID: \"08457cf6-869f-4289-add8-7ee5683b189c\") " pod="openshift-marketplace/community-operators-b9fp2" Oct 01 11:22:01 crc kubenswrapper[4735]: I1001 11:22:01.410998 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hr556/crc-debug-hffp8" Oct 01 11:22:01 crc kubenswrapper[4735]: I1001 11:22:01.423751 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hhqd\" (UniqueName: \"kubernetes.io/projected/08457cf6-869f-4289-add8-7ee5683b189c-kube-api-access-4hhqd\") pod \"community-operators-b9fp2\" (UID: \"08457cf6-869f-4289-add8-7ee5683b189c\") " pod="openshift-marketplace/community-operators-b9fp2" Oct 01 11:22:01 crc kubenswrapper[4735]: I1001 11:22:01.499434 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c452854b-71a8-4aad-baee-9c4e59b2d082-host\") pod \"c452854b-71a8-4aad-baee-9c4e59b2d082\" (UID: \"c452854b-71a8-4aad-baee-9c4e59b2d082\") " Oct 01 11:22:01 crc kubenswrapper[4735]: I1001 11:22:01.499958 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlsn5\" (UniqueName: \"kubernetes.io/projected/c452854b-71a8-4aad-baee-9c4e59b2d082-kube-api-access-xlsn5\") pod \"c452854b-71a8-4aad-baee-9c4e59b2d082\" (UID: \"c452854b-71a8-4aad-baee-9c4e59b2d082\") " Oct 01 11:22:01 crc kubenswrapper[4735]: I1001 11:22:01.499654 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c452854b-71a8-4aad-baee-9c4e59b2d082-host" (OuterVolumeSpecName: "host") pod "c452854b-71a8-4aad-baee-9c4e59b2d082" (UID: "c452854b-71a8-4aad-baee-9c4e59b2d082"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 11:22:01 crc kubenswrapper[4735]: I1001 11:22:01.500571 4735 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c452854b-71a8-4aad-baee-9c4e59b2d082-host\") on node \"crc\" DevicePath \"\"" Oct 01 11:22:01 crc kubenswrapper[4735]: I1001 11:22:01.505988 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c452854b-71a8-4aad-baee-9c4e59b2d082-kube-api-access-xlsn5" (OuterVolumeSpecName: "kube-api-access-xlsn5") pod "c452854b-71a8-4aad-baee-9c4e59b2d082" (UID: "c452854b-71a8-4aad-baee-9c4e59b2d082"). InnerVolumeSpecName "kube-api-access-xlsn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:22:01 crc kubenswrapper[4735]: I1001 11:22:01.573840 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b9fp2" Oct 01 11:22:01 crc kubenswrapper[4735]: I1001 11:22:01.602545 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlsn5\" (UniqueName: \"kubernetes.io/projected/c452854b-71a8-4aad-baee-9c4e59b2d082-kube-api-access-xlsn5\") on node \"crc\" DevicePath \"\"" Oct 01 11:22:02 crc kubenswrapper[4735]: I1001 11:22:02.066550 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b9fp2"] Oct 01 11:22:02 crc kubenswrapper[4735]: W1001 11:22:02.067567 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08457cf6_869f_4289_add8_7ee5683b189c.slice/crio-9ca8506a1ef520011bbf85e8f085299b8c7f51949b2982d8fbb8c1ab8a85fec3 WatchSource:0}: Error finding container 9ca8506a1ef520011bbf85e8f085299b8c7f51949b2982d8fbb8c1ab8a85fec3: Status 404 returned error can't find the container with id 9ca8506a1ef520011bbf85e8f085299b8c7f51949b2982d8fbb8c1ab8a85fec3 Oct 01 11:22:02 crc kubenswrapper[4735]: 
I1001 11:22:02.308935 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hr556/crc-debug-hffp8" event={"ID":"c452854b-71a8-4aad-baee-9c4e59b2d082","Type":"ContainerDied","Data":"90e1c6c7fc968dc25ae747f94f34998153faf92b61dac578f0ac45e36c66d85a"} Oct 01 11:22:02 crc kubenswrapper[4735]: I1001 11:22:02.309280 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90e1c6c7fc968dc25ae747f94f34998153faf92b61dac578f0ac45e36c66d85a" Oct 01 11:22:02 crc kubenswrapper[4735]: I1001 11:22:02.308978 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hr556/crc-debug-hffp8" Oct 01 11:22:02 crc kubenswrapper[4735]: I1001 11:22:02.310418 4735 generic.go:334] "Generic (PLEG): container finished" podID="08457cf6-869f-4289-add8-7ee5683b189c" containerID="659484e1b05661ce1baedbe4d07805e74a05f4b4bea55955d4852655c4867881" exitCode=0 Oct 01 11:22:02 crc kubenswrapper[4735]: I1001 11:22:02.310444 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9fp2" event={"ID":"08457cf6-869f-4289-add8-7ee5683b189c","Type":"ContainerDied","Data":"659484e1b05661ce1baedbe4d07805e74a05f4b4bea55955d4852655c4867881"} Oct 01 11:22:02 crc kubenswrapper[4735]: I1001 11:22:02.310460 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9fp2" event={"ID":"08457cf6-869f-4289-add8-7ee5683b189c","Type":"ContainerStarted","Data":"9ca8506a1ef520011bbf85e8f085299b8c7f51949b2982d8fbb8c1ab8a85fec3"} Oct 01 11:22:03 crc kubenswrapper[4735]: I1001 11:22:03.320461 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9fp2" event={"ID":"08457cf6-869f-4289-add8-7ee5683b189c","Type":"ContainerStarted","Data":"53a7026349b28ee724e061a6d1c3e03660e012877b93ad9444059867e343d79e"} Oct 01 11:22:04 crc kubenswrapper[4735]: I1001 11:22:04.328384 4735 generic.go:334] 
"Generic (PLEG): container finished" podID="08457cf6-869f-4289-add8-7ee5683b189c" containerID="53a7026349b28ee724e061a6d1c3e03660e012877b93ad9444059867e343d79e" exitCode=0 Oct 01 11:22:04 crc kubenswrapper[4735]: I1001 11:22:04.328430 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9fp2" event={"ID":"08457cf6-869f-4289-add8-7ee5683b189c","Type":"ContainerDied","Data":"53a7026349b28ee724e061a6d1c3e03660e012877b93ad9444059867e343d79e"} Oct 01 11:22:04 crc kubenswrapper[4735]: I1001 11:22:04.897538 4735 scope.go:117] "RemoveContainer" containerID="9aadc0db45cd48b358579f41dff851b6673b4158d94163b133ccc844259cfe53" Oct 01 11:22:04 crc kubenswrapper[4735]: E1001 11:22:04.897729 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:22:05 crc kubenswrapper[4735]: I1001 11:22:05.341521 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9fp2" event={"ID":"08457cf6-869f-4289-add8-7ee5683b189c","Type":"ContainerStarted","Data":"f06d01c8524285e7c3dc79b275f3ec53d4642c861c60c7727ac22a0cfef8525f"} Oct 01 11:22:05 crc kubenswrapper[4735]: I1001 11:22:05.364278 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b9fp2" podStartSLOduration=1.970977667 podStartE2EDuration="4.364245945s" podCreationTimestamp="2025-10-01 11:22:01 +0000 UTC" firstStartedPulling="2025-10-01 11:22:02.311861817 +0000 UTC m=+3881.004683079" lastFinishedPulling="2025-10-01 11:22:04.705130095 +0000 UTC m=+3883.397951357" observedRunningTime="2025-10-01 
11:22:05.355696115 +0000 UTC m=+3884.048517377" watchObservedRunningTime="2025-10-01 11:22:05.364245945 +0000 UTC m=+3884.057067207" Oct 01 11:22:06 crc kubenswrapper[4735]: I1001 11:22:06.625877 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hr556/crc-debug-hffp8"] Oct 01 11:22:06 crc kubenswrapper[4735]: I1001 11:22:06.639301 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hr556/crc-debug-hffp8"] Oct 01 11:22:07 crc kubenswrapper[4735]: I1001 11:22:07.820818 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hr556/crc-debug-52z6k"] Oct 01 11:22:07 crc kubenswrapper[4735]: E1001 11:22:07.821317 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c452854b-71a8-4aad-baee-9c4e59b2d082" containerName="container-00" Oct 01 11:22:07 crc kubenswrapper[4735]: I1001 11:22:07.821340 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c452854b-71a8-4aad-baee-9c4e59b2d082" containerName="container-00" Oct 01 11:22:07 crc kubenswrapper[4735]: I1001 11:22:07.822753 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c452854b-71a8-4aad-baee-9c4e59b2d082" containerName="container-00" Oct 01 11:22:07 crc kubenswrapper[4735]: I1001 11:22:07.823886 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hr556/crc-debug-52z6k" Oct 01 11:22:07 crc kubenswrapper[4735]: I1001 11:22:07.916894 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4dba5b75-fd33-4c97-b3ab-c27427575af3-host\") pod \"crc-debug-52z6k\" (UID: \"4dba5b75-fd33-4c97-b3ab-c27427575af3\") " pod="openshift-must-gather-hr556/crc-debug-52z6k" Oct 01 11:22:07 crc kubenswrapper[4735]: I1001 11:22:07.917076 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cbfs\" (UniqueName: \"kubernetes.io/projected/4dba5b75-fd33-4c97-b3ab-c27427575af3-kube-api-access-9cbfs\") pod \"crc-debug-52z6k\" (UID: \"4dba5b75-fd33-4c97-b3ab-c27427575af3\") " pod="openshift-must-gather-hr556/crc-debug-52z6k" Oct 01 11:22:07 crc kubenswrapper[4735]: I1001 11:22:07.926748 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c452854b-71a8-4aad-baee-9c4e59b2d082" path="/var/lib/kubelet/pods/c452854b-71a8-4aad-baee-9c4e59b2d082/volumes" Oct 01 11:22:08 crc kubenswrapper[4735]: I1001 11:22:08.018821 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4dba5b75-fd33-4c97-b3ab-c27427575af3-host\") pod \"crc-debug-52z6k\" (UID: \"4dba5b75-fd33-4c97-b3ab-c27427575af3\") " pod="openshift-must-gather-hr556/crc-debug-52z6k" Oct 01 11:22:08 crc kubenswrapper[4735]: I1001 11:22:08.019038 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cbfs\" (UniqueName: \"kubernetes.io/projected/4dba5b75-fd33-4c97-b3ab-c27427575af3-kube-api-access-9cbfs\") pod \"crc-debug-52z6k\" (UID: \"4dba5b75-fd33-4c97-b3ab-c27427575af3\") " pod="openshift-must-gather-hr556/crc-debug-52z6k" Oct 01 11:22:08 crc kubenswrapper[4735]: I1001 11:22:08.019611 4735 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4dba5b75-fd33-4c97-b3ab-c27427575af3-host\") pod \"crc-debug-52z6k\" (UID: \"4dba5b75-fd33-4c97-b3ab-c27427575af3\") " pod="openshift-must-gather-hr556/crc-debug-52z6k" Oct 01 11:22:08 crc kubenswrapper[4735]: I1001 11:22:08.049249 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cbfs\" (UniqueName: \"kubernetes.io/projected/4dba5b75-fd33-4c97-b3ab-c27427575af3-kube-api-access-9cbfs\") pod \"crc-debug-52z6k\" (UID: \"4dba5b75-fd33-4c97-b3ab-c27427575af3\") " pod="openshift-must-gather-hr556/crc-debug-52z6k" Oct 01 11:22:08 crc kubenswrapper[4735]: I1001 11:22:08.158992 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hr556/crc-debug-52z6k" Oct 01 11:22:08 crc kubenswrapper[4735]: W1001 11:22:08.202890 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4dba5b75_fd33_4c97_b3ab_c27427575af3.slice/crio-e082337c20ab76491544ef610dbbeda1f8d083a4b74515e021d2be7d894fef4d WatchSource:0}: Error finding container e082337c20ab76491544ef610dbbeda1f8d083a4b74515e021d2be7d894fef4d: Status 404 returned error can't find the container with id e082337c20ab76491544ef610dbbeda1f8d083a4b74515e021d2be7d894fef4d Oct 01 11:22:08 crc kubenswrapper[4735]: I1001 11:22:08.371424 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hr556/crc-debug-52z6k" event={"ID":"4dba5b75-fd33-4c97-b3ab-c27427575af3","Type":"ContainerStarted","Data":"e082337c20ab76491544ef610dbbeda1f8d083a4b74515e021d2be7d894fef4d"} Oct 01 11:22:09 crc kubenswrapper[4735]: I1001 11:22:09.384660 4735 generic.go:334] "Generic (PLEG): container finished" podID="4dba5b75-fd33-4c97-b3ab-c27427575af3" containerID="b80ef42591117d5ce36f5c5b33ea6582789f414a6ca1b734f825043a9f58b05e" exitCode=0 Oct 01 11:22:09 crc kubenswrapper[4735]: I1001 11:22:09.384744 4735 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hr556/crc-debug-52z6k" event={"ID":"4dba5b75-fd33-4c97-b3ab-c27427575af3","Type":"ContainerDied","Data":"b80ef42591117d5ce36f5c5b33ea6582789f414a6ca1b734f825043a9f58b05e"} Oct 01 11:22:09 crc kubenswrapper[4735]: I1001 11:22:09.434350 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hr556/crc-debug-52z6k"] Oct 01 11:22:09 crc kubenswrapper[4735]: I1001 11:22:09.442927 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hr556/crc-debug-52z6k"] Oct 01 11:22:10 crc kubenswrapper[4735]: I1001 11:22:10.518775 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hr556/crc-debug-52z6k" Oct 01 11:22:10 crc kubenswrapper[4735]: I1001 11:22:10.676302 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4dba5b75-fd33-4c97-b3ab-c27427575af3-host\") pod \"4dba5b75-fd33-4c97-b3ab-c27427575af3\" (UID: \"4dba5b75-fd33-4c97-b3ab-c27427575af3\") " Oct 01 11:22:10 crc kubenswrapper[4735]: I1001 11:22:10.676454 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cbfs\" (UniqueName: \"kubernetes.io/projected/4dba5b75-fd33-4c97-b3ab-c27427575af3-kube-api-access-9cbfs\") pod \"4dba5b75-fd33-4c97-b3ab-c27427575af3\" (UID: \"4dba5b75-fd33-4c97-b3ab-c27427575af3\") " Oct 01 11:22:10 crc kubenswrapper[4735]: I1001 11:22:10.676441 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4dba5b75-fd33-4c97-b3ab-c27427575af3-host" (OuterVolumeSpecName: "host") pod "4dba5b75-fd33-4c97-b3ab-c27427575af3" (UID: "4dba5b75-fd33-4c97-b3ab-c27427575af3"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 11:22:10 crc kubenswrapper[4735]: I1001 11:22:10.684798 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dba5b75-fd33-4c97-b3ab-c27427575af3-kube-api-access-9cbfs" (OuterVolumeSpecName: "kube-api-access-9cbfs") pod "4dba5b75-fd33-4c97-b3ab-c27427575af3" (UID: "4dba5b75-fd33-4c97-b3ab-c27427575af3"). InnerVolumeSpecName "kube-api-access-9cbfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:22:10 crc kubenswrapper[4735]: I1001 11:22:10.778126 4735 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4dba5b75-fd33-4c97-b3ab-c27427575af3-host\") on node \"crc\" DevicePath \"\"" Oct 01 11:22:10 crc kubenswrapper[4735]: I1001 11:22:10.778172 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cbfs\" (UniqueName: \"kubernetes.io/projected/4dba5b75-fd33-4c97-b3ab-c27427575af3-kube-api-access-9cbfs\") on node \"crc\" DevicePath \"\"" Oct 01 11:22:11 crc kubenswrapper[4735]: I1001 11:22:11.070871 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc_820a7ecc-7cdd-4ef7-8b74-64419cb96b2d/util/0.log" Oct 01 11:22:11 crc kubenswrapper[4735]: I1001 11:22:11.224678 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc_820a7ecc-7cdd-4ef7-8b74-64419cb96b2d/util/0.log" Oct 01 11:22:11 crc kubenswrapper[4735]: I1001 11:22:11.230879 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc_820a7ecc-7cdd-4ef7-8b74-64419cb96b2d/pull/0.log" Oct 01 11:22:11 crc kubenswrapper[4735]: I1001 11:22:11.284879 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc_820a7ecc-7cdd-4ef7-8b74-64419cb96b2d/pull/0.log" Oct 01 11:22:11 crc kubenswrapper[4735]: I1001 11:22:11.404917 4735 scope.go:117] "RemoveContainer" containerID="b80ef42591117d5ce36f5c5b33ea6582789f414a6ca1b734f825043a9f58b05e" Oct 01 11:22:11 crc kubenswrapper[4735]: I1001 11:22:11.404954 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hr556/crc-debug-52z6k" Oct 01 11:22:11 crc kubenswrapper[4735]: I1001 11:22:11.473777 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc_820a7ecc-7cdd-4ef7-8b74-64419cb96b2d/pull/0.log" Oct 01 11:22:11 crc kubenswrapper[4735]: I1001 11:22:11.477513 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc_820a7ecc-7cdd-4ef7-8b74-64419cb96b2d/util/0.log" Oct 01 11:22:11 crc kubenswrapper[4735]: I1001 11:22:11.523192 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7243c1655a6bfc216b2a92921d0d302a75b0a6c243817f600a16a9402esz2rc_820a7ecc-7cdd-4ef7-8b74-64419cb96b2d/extract/0.log" Oct 01 11:22:11 crc kubenswrapper[4735]: I1001 11:22:11.574742 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b9fp2" Oct 01 11:22:11 crc kubenswrapper[4735]: I1001 11:22:11.574796 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b9fp2" Oct 01 11:22:11 crc kubenswrapper[4735]: I1001 11:22:11.638725 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b9fp2" Oct 01 11:22:11 crc kubenswrapper[4735]: I1001 11:22:11.651696 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-v88gl_27b0c660-ca46-48cb-88ca-bb5715532c80/kube-rbac-proxy/0.log" Oct 01 11:22:11 crc kubenswrapper[4735]: I1001 11:22:11.706485 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-v88gl_27b0c660-ca46-48cb-88ca-bb5715532c80/manager/0.log" Oct 01 11:22:11 crc kubenswrapper[4735]: I1001 11:22:11.770109 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-l5lmn_db6d8b12-0a21-40a5-b23a-e943494a2091/kube-rbac-proxy/0.log" Oct 01 11:22:11 crc kubenswrapper[4735]: I1001 11:22:11.871467 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-l5lmn_db6d8b12-0a21-40a5-b23a-e943494a2091/manager/0.log" Oct 01 11:22:11 crc kubenswrapper[4735]: I1001 11:22:11.918631 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dba5b75-fd33-4c97-b3ab-c27427575af3" path="/var/lib/kubelet/pods/4dba5b75-fd33-4c97-b3ab-c27427575af3/volumes" Oct 01 11:22:11 crc kubenswrapper[4735]: I1001 11:22:11.947739 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-fj5j6_263d49ed-e577-457b-b887-33f95f1bbed0/kube-rbac-proxy/0.log" Oct 01 11:22:11 crc kubenswrapper[4735]: I1001 11:22:11.989122 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-fj5j6_263d49ed-e577-457b-b887-33f95f1bbed0/manager/0.log" Oct 01 11:22:12 crc kubenswrapper[4735]: I1001 11:22:12.083893 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-hgmvf_bb3f6d5e-c5f1-40a5-b85e-dc33dd683d98/kube-rbac-proxy/0.log" Oct 01 11:22:12 crc kubenswrapper[4735]: I1001 
11:22:12.182735 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-hgmvf_bb3f6d5e-c5f1-40a5-b85e-dc33dd683d98/manager/0.log" Oct 01 11:22:12 crc kubenswrapper[4735]: I1001 11:22:12.253307 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-qp9c7_c229f8d8-2a3c-4a42-ae75-f83e7bbc99e9/manager/0.log" Oct 01 11:22:12 crc kubenswrapper[4735]: I1001 11:22:12.279632 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-qp9c7_c229f8d8-2a3c-4a42-ae75-f83e7bbc99e9/kube-rbac-proxy/0.log" Oct 01 11:22:12 crc kubenswrapper[4735]: I1001 11:22:12.361342 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-zsvgx_7df184a1-eb46-4e19-85cd-82e8d5da1880/kube-rbac-proxy/0.log" Oct 01 11:22:12 crc kubenswrapper[4735]: I1001 11:22:12.467507 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b9fp2" Oct 01 11:22:12 crc kubenswrapper[4735]: I1001 11:22:12.477222 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-zsvgx_7df184a1-eb46-4e19-85cd-82e8d5da1880/manager/0.log" Oct 01 11:22:12 crc kubenswrapper[4735]: I1001 11:22:12.512251 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b9fp2"] Oct 01 11:22:12 crc kubenswrapper[4735]: I1001 11:22:12.512673 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-txp8d_a48cc547-6994-48eb-b8db-9682c091fdac/kube-rbac-proxy/0.log" Oct 01 11:22:12 crc kubenswrapper[4735]: I1001 11:22:12.689733 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-hw4lp_8a30378d-5948-4a39-b5ec-85b29f5763e9/kube-rbac-proxy/0.log" Oct 01 11:22:12 crc kubenswrapper[4735]: I1001 11:22:12.709843 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-txp8d_a48cc547-6994-48eb-b8db-9682c091fdac/manager/0.log" Oct 01 11:22:12 crc kubenswrapper[4735]: I1001 11:22:12.724811 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-hw4lp_8a30378d-5948-4a39-b5ec-85b29f5763e9/manager/0.log" Oct 01 11:22:12 crc kubenswrapper[4735]: I1001 11:22:12.857348 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-wftzg_628c079b-418b-4299-9394-a59ab2850d23/kube-rbac-proxy/0.log" Oct 01 11:22:12 crc kubenswrapper[4735]: I1001 11:22:12.931103 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-wftzg_628c079b-418b-4299-9394-a59ab2850d23/manager/0.log" Oct 01 11:22:13 crc kubenswrapper[4735]: I1001 11:22:13.013357 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-7hcv5_a1e6fcf2-bfad-48fe-b655-0e9199818230/kube-rbac-proxy/0.log" Oct 01 11:22:13 crc kubenswrapper[4735]: I1001 11:22:13.053066 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-7hcv5_a1e6fcf2-bfad-48fe-b655-0e9199818230/manager/0.log" Oct 01 11:22:13 crc kubenswrapper[4735]: I1001 11:22:13.135933 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-m5vc9_912f537c-db9b-4256-a5e0-81dc33bcaf3e/kube-rbac-proxy/0.log" Oct 01 11:22:13 crc kubenswrapper[4735]: I1001 11:22:13.259243 4735 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-m5vc9_912f537c-db9b-4256-a5e0-81dc33bcaf3e/manager/0.log" Oct 01 11:22:13 crc kubenswrapper[4735]: I1001 11:22:13.271288 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-9k2wg_663344ae-dd00-416c-9120-d4f0721554b4/kube-rbac-proxy/0.log" Oct 01 11:22:13 crc kubenswrapper[4735]: I1001 11:22:13.318804 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-9k2wg_663344ae-dd00-416c-9120-d4f0721554b4/manager/0.log" Oct 01 11:22:13 crc kubenswrapper[4735]: I1001 11:22:13.423964 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-mr6vv_3fd0c004-178e-41cb-be27-2d2342d9f58c/kube-rbac-proxy/0.log" Oct 01 11:22:13 crc kubenswrapper[4735]: I1001 11:22:13.594347 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-mr6vv_3fd0c004-178e-41cb-be27-2d2342d9f58c/manager/0.log" Oct 01 11:22:13 crc kubenswrapper[4735]: I1001 11:22:13.645103 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-lg9mq_6b1d4e62-3eb6-4090-826d-e627e08c73c6/kube-rbac-proxy/0.log" Oct 01 11:22:13 crc kubenswrapper[4735]: I1001 11:22:13.655520 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-lg9mq_6b1d4e62-3eb6-4090-826d-e627e08c73c6/manager/0.log" Oct 01 11:22:13 crc kubenswrapper[4735]: I1001 11:22:13.780441 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77b9676b8cllzwc_ef40c559-9b0d-478a-baa4-239ab6d71d76/kube-rbac-proxy/0.log" Oct 01 11:22:13 crc 
kubenswrapper[4735]: I1001 11:22:13.829239 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77b9676b8cllzwc_ef40c559-9b0d-478a-baa4-239ab6d71d76/manager/0.log" Oct 01 11:22:13 crc kubenswrapper[4735]: I1001 11:22:13.890460 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6b86d7dbdd-k9n2c_86607e6f-a912-47f7-b72c-8ea925c5bd53/kube-rbac-proxy/0.log" Oct 01 11:22:14 crc kubenswrapper[4735]: I1001 11:22:14.069943 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-794c859bbc-8kzp5_12603a12-a744-42f1-b0fd-e1a88e490d81/kube-rbac-proxy/0.log" Oct 01 11:22:14 crc kubenswrapper[4735]: I1001 11:22:14.309122 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-jmdr7_bf036ed7-e2ad-407c-94d5-ce386d9884b8/registry-server/0.log" Oct 01 11:22:14 crc kubenswrapper[4735]: I1001 11:22:14.345090 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-794c859bbc-8kzp5_12603a12-a744-42f1-b0fd-e1a88e490d81/operator/0.log" Oct 01 11:22:14 crc kubenswrapper[4735]: I1001 11:22:14.406428 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-2pfnh_ef5f656f-602f-475c-8bd2-078c0bb43388/kube-rbac-proxy/0.log" Oct 01 11:22:14 crc kubenswrapper[4735]: I1001 11:22:14.442003 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b9fp2" podUID="08457cf6-869f-4289-add8-7ee5683b189c" containerName="registry-server" containerID="cri-o://f06d01c8524285e7c3dc79b275f3ec53d4642c861c60c7727ac22a0cfef8525f" gracePeriod=2 Oct 01 11:22:14 crc kubenswrapper[4735]: I1001 11:22:14.567113 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-2pfnh_ef5f656f-602f-475c-8bd2-078c0bb43388/manager/0.log" Oct 01 11:22:14 crc kubenswrapper[4735]: I1001 11:22:14.657528 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-p8c9g_9fba9aa7-25bd-4d48-89e9-818af62e38af/kube-rbac-proxy/0.log" Oct 01 11:22:14 crc kubenswrapper[4735]: I1001 11:22:14.669672 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-p8c9g_9fba9aa7-25bd-4d48-89e9-818af62e38af/manager/0.log" Oct 01 11:22:14 crc kubenswrapper[4735]: I1001 11:22:14.888507 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-j6dlz_50560653-c3f8-4fa8-9d19-a1525b1daaa2/kube-rbac-proxy/0.log" Oct 01 11:22:14 crc kubenswrapper[4735]: I1001 11:22:14.909957 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b9fp2" Oct 01 11:22:14 crc kubenswrapper[4735]: I1001 11:22:14.929155 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-k2j2p_d285f86e-bf4c-4e84-8b59-039754ffb39c/operator/0.log" Oct 01 11:22:15 crc kubenswrapper[4735]: I1001 11:22:15.047584 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-j6dlz_50560653-c3f8-4fa8-9d19-a1525b1daaa2/manager/0.log" Oct 01 11:22:15 crc kubenswrapper[4735]: I1001 11:22:15.048626 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6b86d7dbdd-k9n2c_86607e6f-a912-47f7-b72c-8ea925c5bd53/manager/0.log" Oct 01 11:22:15 crc kubenswrapper[4735]: I1001 11:22:15.066033 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08457cf6-869f-4289-add8-7ee5683b189c-catalog-content\") pod \"08457cf6-869f-4289-add8-7ee5683b189c\" (UID: \"08457cf6-869f-4289-add8-7ee5683b189c\") " Oct 01 11:22:15 crc kubenswrapper[4735]: I1001 11:22:15.066296 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08457cf6-869f-4289-add8-7ee5683b189c-utilities\") pod \"08457cf6-869f-4289-add8-7ee5683b189c\" (UID: \"08457cf6-869f-4289-add8-7ee5683b189c\") " Oct 01 11:22:15 crc kubenswrapper[4735]: I1001 11:22:15.066431 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hhqd\" (UniqueName: \"kubernetes.io/projected/08457cf6-869f-4289-add8-7ee5683b189c-kube-api-access-4hhqd\") pod \"08457cf6-869f-4289-add8-7ee5683b189c\" (UID: \"08457cf6-869f-4289-add8-7ee5683b189c\") " Oct 01 11:22:15 crc kubenswrapper[4735]: I1001 11:22:15.069276 4735 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08457cf6-869f-4289-add8-7ee5683b189c-utilities" (OuterVolumeSpecName: "utilities") pod "08457cf6-869f-4289-add8-7ee5683b189c" (UID: "08457cf6-869f-4289-add8-7ee5683b189c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:22:15 crc kubenswrapper[4735]: I1001 11:22:15.086327 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08457cf6-869f-4289-add8-7ee5683b189c-kube-api-access-4hhqd" (OuterVolumeSpecName: "kube-api-access-4hhqd") pod "08457cf6-869f-4289-add8-7ee5683b189c" (UID: "08457cf6-869f-4289-add8-7ee5683b189c"). InnerVolumeSpecName "kube-api-access-4hhqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:22:15 crc kubenswrapper[4735]: I1001 11:22:15.114169 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08457cf6-869f-4289-add8-7ee5683b189c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "08457cf6-869f-4289-add8-7ee5683b189c" (UID: "08457cf6-869f-4289-add8-7ee5683b189c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:22:15 crc kubenswrapper[4735]: I1001 11:22:15.115678 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-svvjg_82c5ecde-4ff5-42bc-9956-45d025d53f45/kube-rbac-proxy/0.log" Oct 01 11:22:15 crc kubenswrapper[4735]: I1001 11:22:15.168107 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hhqd\" (UniqueName: \"kubernetes.io/projected/08457cf6-869f-4289-add8-7ee5683b189c-kube-api-access-4hhqd\") on node \"crc\" DevicePath \"\"" Oct 01 11:22:15 crc kubenswrapper[4735]: I1001 11:22:15.168139 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08457cf6-869f-4289-add8-7ee5683b189c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 11:22:15 crc kubenswrapper[4735]: I1001 11:22:15.168148 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08457cf6-869f-4289-add8-7ee5683b189c-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 11:22:15 crc kubenswrapper[4735]: I1001 11:22:15.184071 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-svvjg_82c5ecde-4ff5-42bc-9956-45d025d53f45/manager/0.log" Oct 01 11:22:15 crc kubenswrapper[4735]: I1001 11:22:15.255205 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-5w5cv_dfaa86bf-1e61-4393-ba0f-b9003fdbde80/kube-rbac-proxy/0.log" Oct 01 11:22:15 crc kubenswrapper[4735]: I1001 11:22:15.274574 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-5w5cv_dfaa86bf-1e61-4393-ba0f-b9003fdbde80/manager/0.log" Oct 01 11:22:15 crc kubenswrapper[4735]: I1001 11:22:15.351452 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6b9957f54f-f29js_c4ac4b8f-7378-438b-8412-1b74d1c3fda9/kube-rbac-proxy/0.log" Oct 01 11:22:15 crc kubenswrapper[4735]: I1001 11:22:15.390331 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6b9957f54f-f29js_c4ac4b8f-7378-438b-8412-1b74d1c3fda9/manager/0.log" Oct 01 11:22:15 crc kubenswrapper[4735]: I1001 11:22:15.452896 4735 generic.go:334] "Generic (PLEG): container finished" podID="08457cf6-869f-4289-add8-7ee5683b189c" containerID="f06d01c8524285e7c3dc79b275f3ec53d4642c861c60c7727ac22a0cfef8525f" exitCode=0 Oct 01 11:22:15 crc kubenswrapper[4735]: I1001 11:22:15.452974 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9fp2" event={"ID":"08457cf6-869f-4289-add8-7ee5683b189c","Type":"ContainerDied","Data":"f06d01c8524285e7c3dc79b275f3ec53d4642c861c60c7727ac22a0cfef8525f"} Oct 01 11:22:15 crc kubenswrapper[4735]: I1001 11:22:15.452986 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b9fp2" Oct 01 11:22:15 crc kubenswrapper[4735]: I1001 11:22:15.453564 4735 scope.go:117] "RemoveContainer" containerID="f06d01c8524285e7c3dc79b275f3ec53d4642c861c60c7727ac22a0cfef8525f" Oct 01 11:22:15 crc kubenswrapper[4735]: I1001 11:22:15.453463 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9fp2" event={"ID":"08457cf6-869f-4289-add8-7ee5683b189c","Type":"ContainerDied","Data":"9ca8506a1ef520011bbf85e8f085299b8c7f51949b2982d8fbb8c1ab8a85fec3"} Oct 01 11:22:15 crc kubenswrapper[4735]: I1001 11:22:15.481698 4735 scope.go:117] "RemoveContainer" containerID="53a7026349b28ee724e061a6d1c3e03660e012877b93ad9444059867e343d79e" Oct 01 11:22:15 crc kubenswrapper[4735]: I1001 11:22:15.500912 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b9fp2"] Oct 01 11:22:15 crc kubenswrapper[4735]: I1001 11:22:15.510292 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b9fp2"] Oct 01 11:22:15 crc kubenswrapper[4735]: I1001 11:22:15.523699 4735 scope.go:117] "RemoveContainer" containerID="659484e1b05661ce1baedbe4d07805e74a05f4b4bea55955d4852655c4867881" Oct 01 11:22:15 crc kubenswrapper[4735]: I1001 11:22:15.564585 4735 scope.go:117] "RemoveContainer" containerID="f06d01c8524285e7c3dc79b275f3ec53d4642c861c60c7727ac22a0cfef8525f" Oct 01 11:22:15 crc kubenswrapper[4735]: E1001 11:22:15.565101 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f06d01c8524285e7c3dc79b275f3ec53d4642c861c60c7727ac22a0cfef8525f\": container with ID starting with f06d01c8524285e7c3dc79b275f3ec53d4642c861c60c7727ac22a0cfef8525f not found: ID does not exist" containerID="f06d01c8524285e7c3dc79b275f3ec53d4642c861c60c7727ac22a0cfef8525f" Oct 01 11:22:15 crc kubenswrapper[4735]: I1001 11:22:15.565152 4735 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f06d01c8524285e7c3dc79b275f3ec53d4642c861c60c7727ac22a0cfef8525f"} err="failed to get container status \"f06d01c8524285e7c3dc79b275f3ec53d4642c861c60c7727ac22a0cfef8525f\": rpc error: code = NotFound desc = could not find container \"f06d01c8524285e7c3dc79b275f3ec53d4642c861c60c7727ac22a0cfef8525f\": container with ID starting with f06d01c8524285e7c3dc79b275f3ec53d4642c861c60c7727ac22a0cfef8525f not found: ID does not exist" Oct 01 11:22:15 crc kubenswrapper[4735]: I1001 11:22:15.565188 4735 scope.go:117] "RemoveContainer" containerID="53a7026349b28ee724e061a6d1c3e03660e012877b93ad9444059867e343d79e" Oct 01 11:22:15 crc kubenswrapper[4735]: E1001 11:22:15.565626 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53a7026349b28ee724e061a6d1c3e03660e012877b93ad9444059867e343d79e\": container with ID starting with 53a7026349b28ee724e061a6d1c3e03660e012877b93ad9444059867e343d79e not found: ID does not exist" containerID="53a7026349b28ee724e061a6d1c3e03660e012877b93ad9444059867e343d79e" Oct 01 11:22:15 crc kubenswrapper[4735]: I1001 11:22:15.565655 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53a7026349b28ee724e061a6d1c3e03660e012877b93ad9444059867e343d79e"} err="failed to get container status \"53a7026349b28ee724e061a6d1c3e03660e012877b93ad9444059867e343d79e\": rpc error: code = NotFound desc = could not find container \"53a7026349b28ee724e061a6d1c3e03660e012877b93ad9444059867e343d79e\": container with ID starting with 53a7026349b28ee724e061a6d1c3e03660e012877b93ad9444059867e343d79e not found: ID does not exist" Oct 01 11:22:15 crc kubenswrapper[4735]: I1001 11:22:15.565673 4735 scope.go:117] "RemoveContainer" containerID="659484e1b05661ce1baedbe4d07805e74a05f4b4bea55955d4852655c4867881" Oct 01 11:22:15 crc kubenswrapper[4735]: E1001 
11:22:15.565933 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"659484e1b05661ce1baedbe4d07805e74a05f4b4bea55955d4852655c4867881\": container with ID starting with 659484e1b05661ce1baedbe4d07805e74a05f4b4bea55955d4852655c4867881 not found: ID does not exist" containerID="659484e1b05661ce1baedbe4d07805e74a05f4b4bea55955d4852655c4867881" Oct 01 11:22:15 crc kubenswrapper[4735]: I1001 11:22:15.565959 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"659484e1b05661ce1baedbe4d07805e74a05f4b4bea55955d4852655c4867881"} err="failed to get container status \"659484e1b05661ce1baedbe4d07805e74a05f4b4bea55955d4852655c4867881\": rpc error: code = NotFound desc = could not find container \"659484e1b05661ce1baedbe4d07805e74a05f4b4bea55955d4852655c4867881\": container with ID starting with 659484e1b05661ce1baedbe4d07805e74a05f4b4bea55955d4852655c4867881 not found: ID does not exist" Oct 01 11:22:15 crc kubenswrapper[4735]: I1001 11:22:15.909760 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08457cf6-869f-4289-add8-7ee5683b189c" path="/var/lib/kubelet/pods/08457cf6-869f-4289-add8-7ee5683b189c/volumes" Oct 01 11:22:19 crc kubenswrapper[4735]: I1001 11:22:19.897437 4735 scope.go:117] "RemoveContainer" containerID="9aadc0db45cd48b358579f41dff851b6673b4158d94163b133ccc844259cfe53" Oct 01 11:22:19 crc kubenswrapper[4735]: E1001 11:22:19.898051 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:22:23 crc kubenswrapper[4735]: I1001 11:22:23.158296 
4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2wtgx"] Oct 01 11:22:23 crc kubenswrapper[4735]: E1001 11:22:23.159206 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dba5b75-fd33-4c97-b3ab-c27427575af3" containerName="container-00" Oct 01 11:22:23 crc kubenswrapper[4735]: I1001 11:22:23.159218 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dba5b75-fd33-4c97-b3ab-c27427575af3" containerName="container-00" Oct 01 11:22:23 crc kubenswrapper[4735]: E1001 11:22:23.159244 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08457cf6-869f-4289-add8-7ee5683b189c" containerName="extract-utilities" Oct 01 11:22:23 crc kubenswrapper[4735]: I1001 11:22:23.159250 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="08457cf6-869f-4289-add8-7ee5683b189c" containerName="extract-utilities" Oct 01 11:22:23 crc kubenswrapper[4735]: E1001 11:22:23.159266 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08457cf6-869f-4289-add8-7ee5683b189c" containerName="registry-server" Oct 01 11:22:23 crc kubenswrapper[4735]: I1001 11:22:23.159272 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="08457cf6-869f-4289-add8-7ee5683b189c" containerName="registry-server" Oct 01 11:22:23 crc kubenswrapper[4735]: E1001 11:22:23.159284 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08457cf6-869f-4289-add8-7ee5683b189c" containerName="extract-content" Oct 01 11:22:23 crc kubenswrapper[4735]: I1001 11:22:23.159290 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="08457cf6-869f-4289-add8-7ee5683b189c" containerName="extract-content" Oct 01 11:22:23 crc kubenswrapper[4735]: I1001 11:22:23.159462 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dba5b75-fd33-4c97-b3ab-c27427575af3" containerName="container-00" Oct 01 11:22:23 crc kubenswrapper[4735]: I1001 11:22:23.159471 4735 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="08457cf6-869f-4289-add8-7ee5683b189c" containerName="registry-server" Oct 01 11:22:23 crc kubenswrapper[4735]: I1001 11:22:23.160797 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2wtgx" Oct 01 11:22:23 crc kubenswrapper[4735]: I1001 11:22:23.183921 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2wtgx"] Oct 01 11:22:23 crc kubenswrapper[4735]: I1001 11:22:23.215084 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40513913-81d5-4f37-8fcf-638523b68deb-utilities\") pod \"redhat-operators-2wtgx\" (UID: \"40513913-81d5-4f37-8fcf-638523b68deb\") " pod="openshift-marketplace/redhat-operators-2wtgx" Oct 01 11:22:23 crc kubenswrapper[4735]: I1001 11:22:23.215308 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40513913-81d5-4f37-8fcf-638523b68deb-catalog-content\") pod \"redhat-operators-2wtgx\" (UID: \"40513913-81d5-4f37-8fcf-638523b68deb\") " pod="openshift-marketplace/redhat-operators-2wtgx" Oct 01 11:22:23 crc kubenswrapper[4735]: I1001 11:22:23.215680 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s98pj\" (UniqueName: \"kubernetes.io/projected/40513913-81d5-4f37-8fcf-638523b68deb-kube-api-access-s98pj\") pod \"redhat-operators-2wtgx\" (UID: \"40513913-81d5-4f37-8fcf-638523b68deb\") " pod="openshift-marketplace/redhat-operators-2wtgx" Oct 01 11:22:23 crc kubenswrapper[4735]: I1001 11:22:23.316821 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s98pj\" (UniqueName: \"kubernetes.io/projected/40513913-81d5-4f37-8fcf-638523b68deb-kube-api-access-s98pj\") pod \"redhat-operators-2wtgx\" (UID: 
\"40513913-81d5-4f37-8fcf-638523b68deb\") " pod="openshift-marketplace/redhat-operators-2wtgx" Oct 01 11:22:23 crc kubenswrapper[4735]: I1001 11:22:23.316876 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40513913-81d5-4f37-8fcf-638523b68deb-utilities\") pod \"redhat-operators-2wtgx\" (UID: \"40513913-81d5-4f37-8fcf-638523b68deb\") " pod="openshift-marketplace/redhat-operators-2wtgx" Oct 01 11:22:23 crc kubenswrapper[4735]: I1001 11:22:23.316937 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40513913-81d5-4f37-8fcf-638523b68deb-catalog-content\") pod \"redhat-operators-2wtgx\" (UID: \"40513913-81d5-4f37-8fcf-638523b68deb\") " pod="openshift-marketplace/redhat-operators-2wtgx" Oct 01 11:22:23 crc kubenswrapper[4735]: I1001 11:22:23.317568 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40513913-81d5-4f37-8fcf-638523b68deb-catalog-content\") pod \"redhat-operators-2wtgx\" (UID: \"40513913-81d5-4f37-8fcf-638523b68deb\") " pod="openshift-marketplace/redhat-operators-2wtgx" Oct 01 11:22:23 crc kubenswrapper[4735]: I1001 11:22:23.317623 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40513913-81d5-4f37-8fcf-638523b68deb-utilities\") pod \"redhat-operators-2wtgx\" (UID: \"40513913-81d5-4f37-8fcf-638523b68deb\") " pod="openshift-marketplace/redhat-operators-2wtgx" Oct 01 11:22:23 crc kubenswrapper[4735]: I1001 11:22:23.340317 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s98pj\" (UniqueName: \"kubernetes.io/projected/40513913-81d5-4f37-8fcf-638523b68deb-kube-api-access-s98pj\") pod \"redhat-operators-2wtgx\" (UID: \"40513913-81d5-4f37-8fcf-638523b68deb\") " 
pod="openshift-marketplace/redhat-operators-2wtgx" Oct 01 11:22:23 crc kubenswrapper[4735]: I1001 11:22:23.526786 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2wtgx" Oct 01 11:22:23 crc kubenswrapper[4735]: I1001 11:22:23.782298 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2wtgx"] Oct 01 11:22:24 crc kubenswrapper[4735]: I1001 11:22:24.536224 4735 generic.go:334] "Generic (PLEG): container finished" podID="40513913-81d5-4f37-8fcf-638523b68deb" containerID="699f5af882c4d9edee5969bd8599373a2bb75f3f2739d73b6e38d2b3c81b827c" exitCode=0 Oct 01 11:22:24 crc kubenswrapper[4735]: I1001 11:22:24.536269 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2wtgx" event={"ID":"40513913-81d5-4f37-8fcf-638523b68deb","Type":"ContainerDied","Data":"699f5af882c4d9edee5969bd8599373a2bb75f3f2739d73b6e38d2b3c81b827c"} Oct 01 11:22:24 crc kubenswrapper[4735]: I1001 11:22:24.536718 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2wtgx" event={"ID":"40513913-81d5-4f37-8fcf-638523b68deb","Type":"ContainerStarted","Data":"d2bc2096dc86b41c968e54e20e6d5db73b781fc5ce19fcf4cee1a27664372254"} Oct 01 11:22:26 crc kubenswrapper[4735]: I1001 11:22:26.556154 4735 generic.go:334] "Generic (PLEG): container finished" podID="40513913-81d5-4f37-8fcf-638523b68deb" containerID="c0865242a8ee5fd3d0b044c38bee8add43080fcd8109d70b9d2fc1d969ed65d6" exitCode=0 Oct 01 11:22:26 crc kubenswrapper[4735]: I1001 11:22:26.556373 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2wtgx" event={"ID":"40513913-81d5-4f37-8fcf-638523b68deb","Type":"ContainerDied","Data":"c0865242a8ee5fd3d0b044c38bee8add43080fcd8109d70b9d2fc1d969ed65d6"} Oct 01 11:22:27 crc kubenswrapper[4735]: I1001 11:22:27.567558 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-2wtgx" event={"ID":"40513913-81d5-4f37-8fcf-638523b68deb","Type":"ContainerStarted","Data":"2305add09dfb5133288f36574e26e18925dda9a09aa7fff05c915187021e67ec"} Oct 01 11:22:27 crc kubenswrapper[4735]: I1001 11:22:27.597317 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2wtgx" podStartSLOduration=2.028335187 podStartE2EDuration="4.597295621s" podCreationTimestamp="2025-10-01 11:22:23 +0000 UTC" firstStartedPulling="2025-10-01 11:22:24.537995258 +0000 UTC m=+3903.230816520" lastFinishedPulling="2025-10-01 11:22:27.106955692 +0000 UTC m=+3905.799776954" observedRunningTime="2025-10-01 11:22:27.590737696 +0000 UTC m=+3906.283558968" watchObservedRunningTime="2025-10-01 11:22:27.597295621 +0000 UTC m=+3906.290116883" Oct 01 11:22:28 crc kubenswrapper[4735]: I1001 11:22:28.058681 4735 scope.go:117] "RemoveContainer" containerID="80272c7462acbb8b675f70e6d9afc1922785fb4886dc6299b6aa50a7c2bee27f" Oct 01 11:22:32 crc kubenswrapper[4735]: I1001 11:22:32.262591 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-76rns_1d4a525c-7b1a-4bde-976a-d4b938c27209/control-plane-machine-set-operator/0.log" Oct 01 11:22:32 crc kubenswrapper[4735]: I1001 11:22:32.419474 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-kgr86_6963210d-abf6-43ad-80ea-72831b6d7504/kube-rbac-proxy/0.log" Oct 01 11:22:32 crc kubenswrapper[4735]: I1001 11:22:32.474961 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-kgr86_6963210d-abf6-43ad-80ea-72831b6d7504/machine-api-operator/0.log" Oct 01 11:22:33 crc kubenswrapper[4735]: I1001 11:22:33.527274 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2wtgx" Oct 01 11:22:33 
crc kubenswrapper[4735]: I1001 11:22:33.527332 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2wtgx" Oct 01 11:22:33 crc kubenswrapper[4735]: I1001 11:22:33.596196 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2wtgx" Oct 01 11:22:33 crc kubenswrapper[4735]: I1001 11:22:33.681559 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2wtgx" Oct 01 11:22:33 crc kubenswrapper[4735]: I1001 11:22:33.832610 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2wtgx"] Oct 01 11:22:34 crc kubenswrapper[4735]: I1001 11:22:34.896729 4735 scope.go:117] "RemoveContainer" containerID="9aadc0db45cd48b358579f41dff851b6673b4158d94163b133ccc844259cfe53" Oct 01 11:22:34 crc kubenswrapper[4735]: E1001 11:22:34.897314 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" Oct 01 11:22:35 crc kubenswrapper[4735]: I1001 11:22:35.646722 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2wtgx" podUID="40513913-81d5-4f37-8fcf-638523b68deb" containerName="registry-server" containerID="cri-o://2305add09dfb5133288f36574e26e18925dda9a09aa7fff05c915187021e67ec" gracePeriod=2 Oct 01 11:22:36 crc kubenswrapper[4735]: I1001 11:22:36.246482 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2wtgx" Oct 01 11:22:36 crc kubenswrapper[4735]: I1001 11:22:36.378993 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40513913-81d5-4f37-8fcf-638523b68deb-utilities\") pod \"40513913-81d5-4f37-8fcf-638523b68deb\" (UID: \"40513913-81d5-4f37-8fcf-638523b68deb\") " Oct 01 11:22:36 crc kubenswrapper[4735]: I1001 11:22:36.379066 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s98pj\" (UniqueName: \"kubernetes.io/projected/40513913-81d5-4f37-8fcf-638523b68deb-kube-api-access-s98pj\") pod \"40513913-81d5-4f37-8fcf-638523b68deb\" (UID: \"40513913-81d5-4f37-8fcf-638523b68deb\") " Oct 01 11:22:36 crc kubenswrapper[4735]: I1001 11:22:36.379175 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40513913-81d5-4f37-8fcf-638523b68deb-catalog-content\") pod \"40513913-81d5-4f37-8fcf-638523b68deb\" (UID: \"40513913-81d5-4f37-8fcf-638523b68deb\") " Oct 01 11:22:36 crc kubenswrapper[4735]: I1001 11:22:36.379935 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40513913-81d5-4f37-8fcf-638523b68deb-utilities" (OuterVolumeSpecName: "utilities") pod "40513913-81d5-4f37-8fcf-638523b68deb" (UID: "40513913-81d5-4f37-8fcf-638523b68deb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:22:36 crc kubenswrapper[4735]: I1001 11:22:36.385329 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40513913-81d5-4f37-8fcf-638523b68deb-kube-api-access-s98pj" (OuterVolumeSpecName: "kube-api-access-s98pj") pod "40513913-81d5-4f37-8fcf-638523b68deb" (UID: "40513913-81d5-4f37-8fcf-638523b68deb"). InnerVolumeSpecName "kube-api-access-s98pj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 11:22:36 crc kubenswrapper[4735]: I1001 11:22:36.471313 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40513913-81d5-4f37-8fcf-638523b68deb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40513913-81d5-4f37-8fcf-638523b68deb" (UID: "40513913-81d5-4f37-8fcf-638523b68deb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 11:22:36 crc kubenswrapper[4735]: I1001 11:22:36.481293 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s98pj\" (UniqueName: \"kubernetes.io/projected/40513913-81d5-4f37-8fcf-638523b68deb-kube-api-access-s98pj\") on node \"crc\" DevicePath \"\"" Oct 01 11:22:36 crc kubenswrapper[4735]: I1001 11:22:36.481327 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40513913-81d5-4f37-8fcf-638523b68deb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 11:22:36 crc kubenswrapper[4735]: I1001 11:22:36.481340 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40513913-81d5-4f37-8fcf-638523b68deb-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 11:22:36 crc kubenswrapper[4735]: I1001 11:22:36.656846 4735 generic.go:334] "Generic (PLEG): container finished" podID="40513913-81d5-4f37-8fcf-638523b68deb" containerID="2305add09dfb5133288f36574e26e18925dda9a09aa7fff05c915187021e67ec" exitCode=0 Oct 01 11:22:36 crc kubenswrapper[4735]: I1001 11:22:36.656888 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2wtgx" event={"ID":"40513913-81d5-4f37-8fcf-638523b68deb","Type":"ContainerDied","Data":"2305add09dfb5133288f36574e26e18925dda9a09aa7fff05c915187021e67ec"} Oct 01 11:22:36 crc kubenswrapper[4735]: I1001 11:22:36.656941 4735 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-2wtgx" event={"ID":"40513913-81d5-4f37-8fcf-638523b68deb","Type":"ContainerDied","Data":"d2bc2096dc86b41c968e54e20e6d5db73b781fc5ce19fcf4cee1a27664372254"} Oct 01 11:22:36 crc kubenswrapper[4735]: I1001 11:22:36.656954 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2wtgx" Oct 01 11:22:36 crc kubenswrapper[4735]: I1001 11:22:36.656960 4735 scope.go:117] "RemoveContainer" containerID="2305add09dfb5133288f36574e26e18925dda9a09aa7fff05c915187021e67ec" Oct 01 11:22:36 crc kubenswrapper[4735]: I1001 11:22:36.681165 4735 scope.go:117] "RemoveContainer" containerID="c0865242a8ee5fd3d0b044c38bee8add43080fcd8109d70b9d2fc1d969ed65d6" Oct 01 11:22:36 crc kubenswrapper[4735]: I1001 11:22:36.718561 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2wtgx"] Oct 01 11:22:36 crc kubenswrapper[4735]: I1001 11:22:36.728133 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2wtgx"] Oct 01 11:22:36 crc kubenswrapper[4735]: I1001 11:22:36.729045 4735 scope.go:117] "RemoveContainer" containerID="699f5af882c4d9edee5969bd8599373a2bb75f3f2739d73b6e38d2b3c81b827c" Oct 01 11:22:36 crc kubenswrapper[4735]: I1001 11:22:36.754859 4735 scope.go:117] "RemoveContainer" containerID="2305add09dfb5133288f36574e26e18925dda9a09aa7fff05c915187021e67ec" Oct 01 11:22:36 crc kubenswrapper[4735]: E1001 11:22:36.761639 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2305add09dfb5133288f36574e26e18925dda9a09aa7fff05c915187021e67ec\": container with ID starting with 2305add09dfb5133288f36574e26e18925dda9a09aa7fff05c915187021e67ec not found: ID does not exist" containerID="2305add09dfb5133288f36574e26e18925dda9a09aa7fff05c915187021e67ec" Oct 01 11:22:36 crc kubenswrapper[4735]: I1001 11:22:36.761692 4735 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2305add09dfb5133288f36574e26e18925dda9a09aa7fff05c915187021e67ec"} err="failed to get container status \"2305add09dfb5133288f36574e26e18925dda9a09aa7fff05c915187021e67ec\": rpc error: code = NotFound desc = could not find container \"2305add09dfb5133288f36574e26e18925dda9a09aa7fff05c915187021e67ec\": container with ID starting with 2305add09dfb5133288f36574e26e18925dda9a09aa7fff05c915187021e67ec not found: ID does not exist" Oct 01 11:22:36 crc kubenswrapper[4735]: I1001 11:22:36.761718 4735 scope.go:117] "RemoveContainer" containerID="c0865242a8ee5fd3d0b044c38bee8add43080fcd8109d70b9d2fc1d969ed65d6" Oct 01 11:22:36 crc kubenswrapper[4735]: E1001 11:22:36.762065 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0865242a8ee5fd3d0b044c38bee8add43080fcd8109d70b9d2fc1d969ed65d6\": container with ID starting with c0865242a8ee5fd3d0b044c38bee8add43080fcd8109d70b9d2fc1d969ed65d6 not found: ID does not exist" containerID="c0865242a8ee5fd3d0b044c38bee8add43080fcd8109d70b9d2fc1d969ed65d6" Oct 01 11:22:36 crc kubenswrapper[4735]: I1001 11:22:36.762092 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0865242a8ee5fd3d0b044c38bee8add43080fcd8109d70b9d2fc1d969ed65d6"} err="failed to get container status \"c0865242a8ee5fd3d0b044c38bee8add43080fcd8109d70b9d2fc1d969ed65d6\": rpc error: code = NotFound desc = could not find container \"c0865242a8ee5fd3d0b044c38bee8add43080fcd8109d70b9d2fc1d969ed65d6\": container with ID starting with c0865242a8ee5fd3d0b044c38bee8add43080fcd8109d70b9d2fc1d969ed65d6 not found: ID does not exist" Oct 01 11:22:36 crc kubenswrapper[4735]: I1001 11:22:36.762108 4735 scope.go:117] "RemoveContainer" containerID="699f5af882c4d9edee5969bd8599373a2bb75f3f2739d73b6e38d2b3c81b827c" Oct 01 11:22:36 crc kubenswrapper[4735]: E1001 
11:22:36.762376 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"699f5af882c4d9edee5969bd8599373a2bb75f3f2739d73b6e38d2b3c81b827c\": container with ID starting with 699f5af882c4d9edee5969bd8599373a2bb75f3f2739d73b6e38d2b3c81b827c not found: ID does not exist" containerID="699f5af882c4d9edee5969bd8599373a2bb75f3f2739d73b6e38d2b3c81b827c" Oct 01 11:22:36 crc kubenswrapper[4735]: I1001 11:22:36.762411 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"699f5af882c4d9edee5969bd8599373a2bb75f3f2739d73b6e38d2b3c81b827c"} err="failed to get container status \"699f5af882c4d9edee5969bd8599373a2bb75f3f2739d73b6e38d2b3c81b827c\": rpc error: code = NotFound desc = could not find container \"699f5af882c4d9edee5969bd8599373a2bb75f3f2739d73b6e38d2b3c81b827c\": container with ID starting with 699f5af882c4d9edee5969bd8599373a2bb75f3f2739d73b6e38d2b3c81b827c not found: ID does not exist" Oct 01 11:22:37 crc kubenswrapper[4735]: I1001 11:22:37.907824 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40513913-81d5-4f37-8fcf-638523b68deb" path="/var/lib/kubelet/pods/40513913-81d5-4f37-8fcf-638523b68deb/volumes" Oct 01 11:22:45 crc kubenswrapper[4735]: I1001 11:22:45.851829 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-srpsm_4a6804d5-21c5-4d3d-9504-d769df881c52/cert-manager-controller/0.log" Oct 01 11:22:45 crc kubenswrapper[4735]: I1001 11:22:45.897988 4735 scope.go:117] "RemoveContainer" containerID="9aadc0db45cd48b358579f41dff851b6673b4158d94163b133ccc844259cfe53" Oct 01 11:22:45 crc kubenswrapper[4735]: E1001 11:22:45.898237 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1"
Oct 01 11:22:45 crc kubenswrapper[4735]: I1001 11:22:45.967300 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-mvxdt_63d28348-4347-431a-97e8-3526e9f66a68/cert-manager-cainjector/0.log"
Oct 01 11:22:46 crc kubenswrapper[4735]: I1001 11:22:46.045091 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-d2xk7_a4f1a7ac-f922-4a82-8675-c87e0921512f/cert-manager-webhook/0.log"
Oct 01 11:22:56 crc kubenswrapper[4735]: I1001 11:22:56.897420 4735 scope.go:117] "RemoveContainer" containerID="9aadc0db45cd48b358579f41dff851b6673b4158d94163b133ccc844259cfe53"
Oct 01 11:22:56 crc kubenswrapper[4735]: E1001 11:22:56.898228 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xgg24_openshift-machine-config-operator(8c2fdbf0-2469-4ca0-8624-d63609123cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1"
Oct 01 11:22:58 crc kubenswrapper[4735]: I1001 11:22:58.980284 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-svjp6_cb894fbc-36ef-4c41-ae21-dff369c41c99/nmstate-console-plugin/0.log"
Oct 01 11:22:59 crc kubenswrapper[4735]: I1001 11:22:59.098650 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-p4k5f_b82af7f6-7fb7-4e7a-9787-1f3b84969763/nmstate-handler/0.log"
Oct 01 11:22:59 crc kubenswrapper[4735]: I1001 11:22:59.838210 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-mpptn_6f89445f-bbda-4a4e-8cc5-ceb03718ffed/kube-rbac-proxy/0.log"
Oct 01 11:22:59 crc kubenswrapper[4735]: I1001 11:22:59.840256 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-mpptn_6f89445f-bbda-4a4e-8cc5-ceb03718ffed/nmstate-metrics/0.log"
Oct 01 11:23:00 crc kubenswrapper[4735]: I1001 11:23:00.014026 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-m6pnj_7da3bf5b-a383-430c-b587-62c7eabeedd1/nmstate-operator/0.log"
Oct 01 11:23:00 crc kubenswrapper[4735]: I1001 11:23:00.033488 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-xsrh7_f37e734e-18a9-4b41-b06d-1da35b2d5654/nmstate-webhook/0.log"
Oct 01 11:23:07 crc kubenswrapper[4735]: I1001 11:23:07.896851 4735 scope.go:117] "RemoveContainer" containerID="9aadc0db45cd48b358579f41dff851b6673b4158d94163b133ccc844259cfe53"
Oct 01 11:23:09 crc kubenswrapper[4735]: I1001 11:23:09.015482 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" event={"ID":"8c2fdbf0-2469-4ca0-8624-d63609123cd1","Type":"ContainerStarted","Data":"ac26ab6dac2e24f997bbaade511de5f46e894f657c32f435041bb14d4dca5900"}
Oct 01 11:23:13 crc kubenswrapper[4735]: I1001 11:23:13.513412 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-6nnhj_d76a5dcc-a5c5-435c-9dfc-11bab4a422e9/kube-rbac-proxy/0.log"
Oct 01 11:23:13 crc kubenswrapper[4735]: I1001 11:23:13.592340 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-6nnhj_d76a5dcc-a5c5-435c-9dfc-11bab4a422e9/controller/0.log"
Oct 01 11:23:13 crc kubenswrapper[4735]: I1001 11:23:13.737462 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/cp-frr-files/0.log"
Oct 01 11:23:13 crc kubenswrapper[4735]: I1001 11:23:13.904597 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/cp-frr-files/0.log"
Oct 01 11:23:13 crc kubenswrapper[4735]: I1001 11:23:13.915014 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/cp-reloader/0.log"
Oct 01 11:23:13 crc kubenswrapper[4735]: I1001 11:23:13.915262 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/cp-metrics/0.log"
Oct 01 11:23:13 crc kubenswrapper[4735]: I1001 11:23:13.939259 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/cp-reloader/0.log"
Oct 01 11:23:14 crc kubenswrapper[4735]: I1001 11:23:14.115585 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/cp-frr-files/0.log"
Oct 01 11:23:14 crc kubenswrapper[4735]: I1001 11:23:14.138002 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/cp-metrics/0.log"
Oct 01 11:23:14 crc kubenswrapper[4735]: I1001 11:23:14.140204 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/cp-metrics/0.log"
Oct 01 11:23:14 crc kubenswrapper[4735]: I1001 11:23:14.185745 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/cp-reloader/0.log"
Oct 01 11:23:14 crc kubenswrapper[4735]: I1001 11:23:14.276111 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/cp-frr-files/0.log"
Oct 01 11:23:14 crc kubenswrapper[4735]: I1001 11:23:14.297980 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/cp-reloader/0.log"
Oct 01 11:23:14 crc kubenswrapper[4735]: I1001 11:23:14.299490 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/cp-metrics/0.log"
Oct 01 11:23:14 crc kubenswrapper[4735]: I1001 11:23:14.367636 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/controller/0.log"
Oct 01 11:23:14 crc kubenswrapper[4735]: I1001 11:23:14.481380 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/kube-rbac-proxy/0.log"
Oct 01 11:23:14 crc kubenswrapper[4735]: I1001 11:23:14.519652 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/frr-metrics/0.log"
Oct 01 11:23:14 crc kubenswrapper[4735]: I1001 11:23:14.567098 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/kube-rbac-proxy-frr/0.log"
Oct 01 11:23:14 crc kubenswrapper[4735]: I1001 11:23:14.698574 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/reloader/0.log"
Oct 01 11:23:14 crc kubenswrapper[4735]: I1001 11:23:14.817127 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-cgg5l_d45651e4-469d-458c-9d48-ad996f82c3f0/frr-k8s-webhook-server/0.log"
Oct 01 11:23:14 crc kubenswrapper[4735]: I1001 11:23:14.971056 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-755f8bc9ff-4w5jq_a1a673c2-160f-4f1a-8bcf-fbc1e3692cb9/manager/0.log"
Oct 01 11:23:15 crc kubenswrapper[4735]: I1001 11:23:15.128918 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5455ff795f-xpx6b_db9781ea-0490-415d-8e5b-7b64d4aa62dd/webhook-server/0.log"
Oct 01 11:23:15 crc kubenswrapper[4735]: I1001 11:23:15.300599 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-9c82t_ea017fc5-1856-47da-99a5-c866738be35e/kube-rbac-proxy/0.log"
Oct 01 11:23:15 crc kubenswrapper[4735]: I1001 11:23:15.688313 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-9c82t_ea017fc5-1856-47da-99a5-c866738be35e/speaker/0.log"
Oct 01 11:23:15 crc kubenswrapper[4735]: I1001 11:23:15.696285 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w6xhg_cfedcee2-b1bb-4709-af40-1d2c309be304/frr/0.log"
Oct 01 11:23:29 crc kubenswrapper[4735]: I1001 11:23:29.793015 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn_b1cb929b-2595-4752-a499-91d2401f5755/util/0.log"
Oct 01 11:23:30 crc kubenswrapper[4735]: I1001 11:23:30.643705 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn_b1cb929b-2595-4752-a499-91d2401f5755/util/0.log"
Oct 01 11:23:30 crc kubenswrapper[4735]: I1001 11:23:30.662566 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn_b1cb929b-2595-4752-a499-91d2401f5755/pull/0.log"
Oct 01 11:23:30 crc kubenswrapper[4735]: I1001 11:23:30.731914 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn_b1cb929b-2595-4752-a499-91d2401f5755/pull/0.log"
Oct 01 11:23:30 crc kubenswrapper[4735]: I1001 11:23:30.895314 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn_b1cb929b-2595-4752-a499-91d2401f5755/util/0.log"
Oct 01 11:23:30 crc kubenswrapper[4735]: I1001 11:23:30.968854 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn_b1cb929b-2595-4752-a499-91d2401f5755/extract/0.log"
Oct 01 11:23:30 crc kubenswrapper[4735]: I1001 11:23:30.972383 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcq22dn_b1cb929b-2595-4752-a499-91d2401f5755/pull/0.log"
Oct 01 11:23:31 crc kubenswrapper[4735]: I1001 11:23:31.076917 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pxqpm_cd664456-b523-413a-91a4-04d55c466f57/extract-utilities/0.log"
Oct 01 11:23:31 crc kubenswrapper[4735]: I1001 11:23:31.276629 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pxqpm_cd664456-b523-413a-91a4-04d55c466f57/extract-utilities/0.log"
Oct 01 11:23:31 crc kubenswrapper[4735]: I1001 11:23:31.286123 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pxqpm_cd664456-b523-413a-91a4-04d55c466f57/extract-content/0.log"
Oct 01 11:23:31 crc kubenswrapper[4735]: I1001 11:23:31.297145 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pxqpm_cd664456-b523-413a-91a4-04d55c466f57/extract-content/0.log"
Oct 01 11:23:31 crc kubenswrapper[4735]: I1001 11:23:31.467265 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pxqpm_cd664456-b523-413a-91a4-04d55c466f57/extract-utilities/0.log"
Oct 01 11:23:31 crc kubenswrapper[4735]: I1001 11:23:31.478598 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pxqpm_cd664456-b523-413a-91a4-04d55c466f57/extract-content/0.log"
Oct 01 11:23:31 crc kubenswrapper[4735]: I1001 11:23:31.714905 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lmv8p_d0f770d3-02c9-47fb-b650-d515b1c96ea2/extract-utilities/0.log"
Oct 01 11:23:31 crc kubenswrapper[4735]: I1001 11:23:31.944972 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lmv8p_d0f770d3-02c9-47fb-b650-d515b1c96ea2/extract-content/0.log"
Oct 01 11:23:31 crc kubenswrapper[4735]: I1001 11:23:31.962961 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pxqpm_cd664456-b523-413a-91a4-04d55c466f57/registry-server/0.log"
Oct 01 11:23:31 crc kubenswrapper[4735]: I1001 11:23:31.995292 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lmv8p_d0f770d3-02c9-47fb-b650-d515b1c96ea2/extract-content/0.log"
Oct 01 11:23:32 crc kubenswrapper[4735]: I1001 11:23:32.043103 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lmv8p_d0f770d3-02c9-47fb-b650-d515b1c96ea2/extract-utilities/0.log"
Oct 01 11:23:32 crc kubenswrapper[4735]: I1001 11:23:32.144946 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lmv8p_d0f770d3-02c9-47fb-b650-d515b1c96ea2/extract-content/0.log"
Oct 01 11:23:32 crc kubenswrapper[4735]: I1001 11:23:32.159091 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lmv8p_d0f770d3-02c9-47fb-b650-d515b1c96ea2/extract-utilities/0.log"
Oct 01 11:23:32 crc kubenswrapper[4735]: I1001 11:23:32.375642 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2_e8004806-b53f-47ad-928b-1843522489ea/util/0.log"
Oct 01 11:23:32 crc kubenswrapper[4735]: I1001 11:23:32.553366 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2_e8004806-b53f-47ad-928b-1843522489ea/util/0.log"
Oct 01 11:23:32 crc kubenswrapper[4735]: I1001 11:23:32.591962 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2_e8004806-b53f-47ad-928b-1843522489ea/pull/0.log"
Oct 01 11:23:32 crc kubenswrapper[4735]: I1001 11:23:32.653194 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2_e8004806-b53f-47ad-928b-1843522489ea/pull/0.log"
Oct 01 11:23:32 crc kubenswrapper[4735]: I1001 11:23:32.745530 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2_e8004806-b53f-47ad-928b-1843522489ea/util/0.log"
Oct 01 11:23:32 crc kubenswrapper[4735]: I1001 11:23:32.828466 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lmv8p_d0f770d3-02c9-47fb-b650-d515b1c96ea2/registry-server/0.log"
Oct 01 11:23:32 crc kubenswrapper[4735]: I1001 11:23:32.863384 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2_e8004806-b53f-47ad-928b-1843522489ea/extract/0.log"
Oct 01 11:23:32 crc kubenswrapper[4735]: I1001 11:23:32.868653 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96bjmx2_e8004806-b53f-47ad-928b-1843522489ea/pull/0.log"
Oct 01 11:23:33 crc kubenswrapper[4735]: I1001 11:23:33.038335 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-twfwh_94467536-0aa2-426e-a14a-bb05c8afd56c/marketplace-operator/0.log"
Oct 01 11:23:33 crc kubenswrapper[4735]: I1001 11:23:33.106670 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dshz9_0554f347-c661-432f-ad8f-e64550027f55/extract-utilities/0.log"
Oct 01 11:23:33 crc kubenswrapper[4735]: I1001 11:23:33.250663 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dshz9_0554f347-c661-432f-ad8f-e64550027f55/extract-utilities/0.log"
Oct 01 11:23:33 crc kubenswrapper[4735]: I1001 11:23:33.255679 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dshz9_0554f347-c661-432f-ad8f-e64550027f55/extract-content/0.log"
Oct 01 11:23:33 crc kubenswrapper[4735]: I1001 11:23:33.298666 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dshz9_0554f347-c661-432f-ad8f-e64550027f55/extract-content/0.log"
Oct 01 11:23:33 crc kubenswrapper[4735]: I1001 11:23:33.449187 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dshz9_0554f347-c661-432f-ad8f-e64550027f55/extract-content/0.log"
Oct 01 11:23:33 crc kubenswrapper[4735]: I1001 11:23:33.496712 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dshz9_0554f347-c661-432f-ad8f-e64550027f55/extract-utilities/0.log"
Oct 01 11:23:33 crc kubenswrapper[4735]: I1001 11:23:33.546168 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-h4kg5_921e36dc-85a8-400f-b33f-c5172a57d95b/extract-utilities/0.log"
Oct 01 11:23:33 crc kubenswrapper[4735]: I1001 11:23:33.606950 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dshz9_0554f347-c661-432f-ad8f-e64550027f55/registry-server/0.log"
Oct 01 11:23:33 crc kubenswrapper[4735]: I1001 11:23:33.706049 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-h4kg5_921e36dc-85a8-400f-b33f-c5172a57d95b/extract-utilities/0.log"
Oct 01 11:23:33 crc kubenswrapper[4735]: I1001 11:23:33.726554 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-h4kg5_921e36dc-85a8-400f-b33f-c5172a57d95b/extract-content/0.log"
Oct 01 11:23:33 crc kubenswrapper[4735]: I1001 11:23:33.747886 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-h4kg5_921e36dc-85a8-400f-b33f-c5172a57d95b/extract-content/0.log"
Oct 01 11:23:33 crc kubenswrapper[4735]: I1001 11:23:33.906282 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-h4kg5_921e36dc-85a8-400f-b33f-c5172a57d95b/extract-utilities/0.log"
Oct 01 11:23:33 crc kubenswrapper[4735]: I1001 11:23:33.911051 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-h4kg5_921e36dc-85a8-400f-b33f-c5172a57d95b/extract-content/0.log"
Oct 01 11:23:34 crc kubenswrapper[4735]: I1001 11:23:34.363719 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-h4kg5_921e36dc-85a8-400f-b33f-c5172a57d95b/registry-server/0.log"
Oct 01 11:25:28 crc kubenswrapper[4735]: I1001 11:25:28.466596 4735 generic.go:334] "Generic (PLEG): container finished" podID="650ce69a-32a5-4788-aa5e-918fd264ecf7" containerID="c36861d25e228796a3ae72000be80061d940934100f035512d7c55bd2c515224" exitCode=0
Oct 01 11:25:28 crc kubenswrapper[4735]: I1001 11:25:28.466698 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hr556/must-gather-c4kx7" event={"ID":"650ce69a-32a5-4788-aa5e-918fd264ecf7","Type":"ContainerDied","Data":"c36861d25e228796a3ae72000be80061d940934100f035512d7c55bd2c515224"}
Oct 01 11:25:28 crc kubenswrapper[4735]: I1001 11:25:28.468221 4735 scope.go:117] "RemoveContainer" containerID="c36861d25e228796a3ae72000be80061d940934100f035512d7c55bd2c515224"
Oct 01 11:25:29 crc kubenswrapper[4735]: I1001 11:25:29.059629 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hr556_must-gather-c4kx7_650ce69a-32a5-4788-aa5e-918fd264ecf7/gather/0.log"
Oct 01 11:25:35 crc kubenswrapper[4735]: I1001 11:25:35.486358 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 11:25:35 crc kubenswrapper[4735]: I1001 11:25:35.487043 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 11:25:40 crc kubenswrapper[4735]: I1001 11:25:40.916689 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hr556/must-gather-c4kx7"]
Oct 01 11:25:40 crc kubenswrapper[4735]: I1001 11:25:40.917743 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-hr556/must-gather-c4kx7" podUID="650ce69a-32a5-4788-aa5e-918fd264ecf7" containerName="copy" containerID="cri-o://a7e861539d9f6ff7072f516af5f6b3ba05361350faba4c1a443d68867a9cc6ce" gracePeriod=2
Oct 01 11:25:40 crc kubenswrapper[4735]: I1001 11:25:40.921164 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hr556/must-gather-c4kx7"]
Oct 01 11:25:41 crc kubenswrapper[4735]: I1001 11:25:41.399520 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hr556_must-gather-c4kx7_650ce69a-32a5-4788-aa5e-918fd264ecf7/copy/0.log"
Oct 01 11:25:41 crc kubenswrapper[4735]: I1001 11:25:41.400576 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hr556/must-gather-c4kx7"
Oct 01 11:25:41 crc kubenswrapper[4735]: I1001 11:25:41.523457 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/650ce69a-32a5-4788-aa5e-918fd264ecf7-must-gather-output\") pod \"650ce69a-32a5-4788-aa5e-918fd264ecf7\" (UID: \"650ce69a-32a5-4788-aa5e-918fd264ecf7\") "
Oct 01 11:25:41 crc kubenswrapper[4735]: I1001 11:25:41.523560 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5b2qj\" (UniqueName: \"kubernetes.io/projected/650ce69a-32a5-4788-aa5e-918fd264ecf7-kube-api-access-5b2qj\") pod \"650ce69a-32a5-4788-aa5e-918fd264ecf7\" (UID: \"650ce69a-32a5-4788-aa5e-918fd264ecf7\") "
Oct 01 11:25:41 crc kubenswrapper[4735]: I1001 11:25:41.530252 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/650ce69a-32a5-4788-aa5e-918fd264ecf7-kube-api-access-5b2qj" (OuterVolumeSpecName: "kube-api-access-5b2qj") pod "650ce69a-32a5-4788-aa5e-918fd264ecf7" (UID: "650ce69a-32a5-4788-aa5e-918fd264ecf7"). InnerVolumeSpecName "kube-api-access-5b2qj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 11:25:41 crc kubenswrapper[4735]: I1001 11:25:41.626127 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5b2qj\" (UniqueName: \"kubernetes.io/projected/650ce69a-32a5-4788-aa5e-918fd264ecf7-kube-api-access-5b2qj\") on node \"crc\" DevicePath \"\""
Oct 01 11:25:41 crc kubenswrapper[4735]: I1001 11:25:41.630946 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hr556_must-gather-c4kx7_650ce69a-32a5-4788-aa5e-918fd264ecf7/copy/0.log"
Oct 01 11:25:41 crc kubenswrapper[4735]: I1001 11:25:41.631291 4735 generic.go:334] "Generic (PLEG): container finished" podID="650ce69a-32a5-4788-aa5e-918fd264ecf7" containerID="a7e861539d9f6ff7072f516af5f6b3ba05361350faba4c1a443d68867a9cc6ce" exitCode=143
Oct 01 11:25:41 crc kubenswrapper[4735]: I1001 11:25:41.631346 4735 scope.go:117] "RemoveContainer" containerID="a7e861539d9f6ff7072f516af5f6b3ba05361350faba4c1a443d68867a9cc6ce"
Oct 01 11:25:41 crc kubenswrapper[4735]: I1001 11:25:41.631373 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hr556/must-gather-c4kx7"
Oct 01 11:25:41 crc kubenswrapper[4735]: I1001 11:25:41.665035 4735 scope.go:117] "RemoveContainer" containerID="c36861d25e228796a3ae72000be80061d940934100f035512d7c55bd2c515224"
Oct 01 11:25:41 crc kubenswrapper[4735]: I1001 11:25:41.721792 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/650ce69a-32a5-4788-aa5e-918fd264ecf7-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "650ce69a-32a5-4788-aa5e-918fd264ecf7" (UID: "650ce69a-32a5-4788-aa5e-918fd264ecf7"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 11:25:41 crc kubenswrapper[4735]: I1001 11:25:41.728452 4735 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/650ce69a-32a5-4788-aa5e-918fd264ecf7-must-gather-output\") on node \"crc\" DevicePath \"\""
Oct 01 11:25:41 crc kubenswrapper[4735]: I1001 11:25:41.740380 4735 scope.go:117] "RemoveContainer" containerID="a7e861539d9f6ff7072f516af5f6b3ba05361350faba4c1a443d68867a9cc6ce"
Oct 01 11:25:41 crc kubenswrapper[4735]: E1001 11:25:41.740922 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7e861539d9f6ff7072f516af5f6b3ba05361350faba4c1a443d68867a9cc6ce\": container with ID starting with a7e861539d9f6ff7072f516af5f6b3ba05361350faba4c1a443d68867a9cc6ce not found: ID does not exist" containerID="a7e861539d9f6ff7072f516af5f6b3ba05361350faba4c1a443d68867a9cc6ce"
Oct 01 11:25:41 crc kubenswrapper[4735]: I1001 11:25:41.740950 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7e861539d9f6ff7072f516af5f6b3ba05361350faba4c1a443d68867a9cc6ce"} err="failed to get container status \"a7e861539d9f6ff7072f516af5f6b3ba05361350faba4c1a443d68867a9cc6ce\": rpc error: code = NotFound desc = could not find container \"a7e861539d9f6ff7072f516af5f6b3ba05361350faba4c1a443d68867a9cc6ce\": container with ID starting with a7e861539d9f6ff7072f516af5f6b3ba05361350faba4c1a443d68867a9cc6ce not found: ID does not exist"
Oct 01 11:25:41 crc kubenswrapper[4735]: I1001 11:25:41.740971 4735 scope.go:117] "RemoveContainer" containerID="c36861d25e228796a3ae72000be80061d940934100f035512d7c55bd2c515224"
Oct 01 11:25:41 crc kubenswrapper[4735]: E1001 11:25:41.741213 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c36861d25e228796a3ae72000be80061d940934100f035512d7c55bd2c515224\": container with ID starting with c36861d25e228796a3ae72000be80061d940934100f035512d7c55bd2c515224 not found: ID does not exist" containerID="c36861d25e228796a3ae72000be80061d940934100f035512d7c55bd2c515224"
Oct 01 11:25:41 crc kubenswrapper[4735]: I1001 11:25:41.741227 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c36861d25e228796a3ae72000be80061d940934100f035512d7c55bd2c515224"} err="failed to get container status \"c36861d25e228796a3ae72000be80061d940934100f035512d7c55bd2c515224\": rpc error: code = NotFound desc = could not find container \"c36861d25e228796a3ae72000be80061d940934100f035512d7c55bd2c515224\": container with ID starting with c36861d25e228796a3ae72000be80061d940934100f035512d7c55bd2c515224 not found: ID does not exist"
Oct 01 11:25:41 crc kubenswrapper[4735]: I1001 11:25:41.915295 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="650ce69a-32a5-4788-aa5e-918fd264ecf7" path="/var/lib/kubelet/pods/650ce69a-32a5-4788-aa5e-918fd264ecf7/volumes"
Oct 01 11:26:05 crc kubenswrapper[4735]: I1001 11:26:05.485936 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 11:26:05 crc kubenswrapper[4735]: I1001 11:26:05.486751 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 11:26:35 crc kubenswrapper[4735]: I1001 11:26:35.485704 4735 patch_prober.go:28] interesting pod/machine-config-daemon-xgg24 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 11:26:35 crc kubenswrapper[4735]: I1001 11:26:35.487851 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 11:26:35 crc kubenswrapper[4735]: I1001 11:26:35.488054 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xgg24"
Oct 01 11:26:35 crc kubenswrapper[4735]: I1001 11:26:35.489316 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ac26ab6dac2e24f997bbaade511de5f46e894f657c32f435041bb14d4dca5900"} pod="openshift-machine-config-operator/machine-config-daemon-xgg24" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 01 11:26:35 crc kubenswrapper[4735]: I1001 11:26:35.489609 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" podUID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerName="machine-config-daemon" containerID="cri-o://ac26ab6dac2e24f997bbaade511de5f46e894f657c32f435041bb14d4dca5900" gracePeriod=600
Oct 01 11:26:36 crc kubenswrapper[4735]: I1001 11:26:36.278348 4735 generic.go:334] "Generic (PLEG): container finished" podID="8c2fdbf0-2469-4ca0-8624-d63609123cd1" containerID="ac26ab6dac2e24f997bbaade511de5f46e894f657c32f435041bb14d4dca5900" exitCode=0
Oct 01 11:26:36 crc kubenswrapper[4735]: I1001 11:26:36.278445 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" event={"ID":"8c2fdbf0-2469-4ca0-8624-d63609123cd1","Type":"ContainerDied","Data":"ac26ab6dac2e24f997bbaade511de5f46e894f657c32f435041bb14d4dca5900"}
Oct 01 11:26:36 crc kubenswrapper[4735]: I1001 11:26:36.278531 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xgg24" event={"ID":"8c2fdbf0-2469-4ca0-8624-d63609123cd1","Type":"ContainerStarted","Data":"196694c5e58caa0ba5a44ce32b7c79e17caccc74fc9e5d3fab6be71de919f7e6"}
Oct 01 11:26:36 crc kubenswrapper[4735]: I1001 11:26:36.278562 4735 scope.go:117] "RemoveContainer" containerID="9aadc0db45cd48b358579f41dff851b6673b4158d94163b133ccc844259cfe53"